Toppan Digital Language

eBay Using Machine Translation to Support Russian Expansion

eBay machine translation

Any mobile user who has sent or received predictive texts knows how exasperating it can be to get technology to communicate exactly what you want to say. Conveying messages across the web to consumers in different countries faces one major obstacle: language.

Global websites aren’t written in a way that all consumers can understand. And if consumers can’t understand the language, then the company’s potential customer base and revenue are limited.

Twitter, Facebook and eBay are among the huge companies using machine translation to try to get around these issues by translating website content for their customers.

eBay knows that language issues are costing it money. Now the eCommerce giant’s new machine translation director Hassan Sawaf is trialling a default program on its Russian site in a bid to target additional customers.

Sawaf, whose background is in teaching computers to better understand human language, wants to advance the science of online translation.

For example, eBay now has an automated tool that translates listings into Russian from other languages. A Russian shopper wanting to buy, say, a polo shirt from England will view a machine-translated Russian version of the English listing. Other supported languages include Spanish, Chinese (Mandarin) and German.

How is eBay measuring the success of its machine translation?

Revealingly, one of the key metrics eBay engineers use to measure machine translation success is whether a product sells or not, according to a recent Wired report.

They also ask consumers to rate a translation’s quality. This is interesting because most of eBay’s content is user-generated and therefore not normally of the highest quality.

No other eCommerce retailer is asking consumers these questions about its software’s translation quality. Translation algorithms are typically scored by how closely the machine output matches a human reference translation.
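The idea behind such automated scores can be sketched in a few lines. The function below is a toy n-gram precision measure of my own construction, loosely in the spirit of metrics like BLEU, not eBay's actual scoring code: it counts how many of the machine output's word pairs also appear in a human reference.

```python
from collections import Counter

def ngram_precision(candidate, reference, n=2):
    """Fraction of the candidate's n-grams that also appear in the
    reference translation. A toy stand-in for automated scores
    such as BLEU; real metrics combine several n-gram sizes and
    add a brevity penalty."""
    cand_tokens = candidate.lower().split()
    ref_tokens = reference.lower().split()
    cand_ngrams = Counter(tuple(cand_tokens[i:i + n])
                          for i in range(len(cand_tokens) - n + 1))
    ref_ngrams = Counter(tuple(ref_tokens[i:i + n])
                         for i in range(len(ref_tokens) - n + 1))
    if not cand_ngrams:
        return 0.0
    # Clip each n-gram's count at the number of times it occurs
    # in the reference, then divide by the candidate's total.
    overlap = sum(min(count, ref_ngrams[ng])
                  for ng, count in cand_ngrams.items())
    return overlap / sum(cand_ngrams.values())

print(ngram_precision("the cat sat on the mat",
                      "the cat sat on the mat"))  # identical → 1.0
```

Sawaf's point is that a high score on a measure like this says nothing about whether a shopper understood the listing well enough to buy, which is why eBay watches transactions instead.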

Sawaf told Wired that eBay’s machine translations are measured by how consumers respond when undertaking real-life transactions and interactions.

He said eBay is optimizing for consumer experience, behavior and actions rather than for automated scores, which, he says, are nearly always artificial.

Other examples of machine translation

Robot-generated content

Books and newspaper articles are already being written by robots.

France-based business school professor Philip M. Parker now has more than 100,000 Amazon book titles to his name. Each one has been generated by software, and they cover subjects as diverse as Romanian crossword guides and fromage frais fat content.

These could have publishing account directors salivating, as each takes less than 60 minutes to “write.” But none of the titles could be expected to win the Man Booker Prize.

It’s a similar story with machine-generated press articles. Programmer and journalist Ken Schwencke has developed an algorithm that automatically writes a short news story whenever an earthquake happens.

In March Schwencke’s “robot” wrote the breaking news on a Los Angeles earthquake, enabling the LA Times to ‘scoop’ its rivals.
Again, it hardly screams “read me!” and the Pulitzer journalism award can wait.

But this template-driven writing software does raise the question: could it one day be used to translate content? And, if so, would it produce similarly homogenized, “factory-produced” content?
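The template approach behind tools like Schwencke’s can be sketched simply. The template wording, field names and event data below are all illustrative, not the LA Times system: structured data is slotted into a fixed sentence frame, which is why the output reads as competent but formulaic.

```python
# Illustrative sentence frame; the real system's wording is not public here.
EARTHQUAKE_TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {distance} miles from "
    "{place} at {time}, according to preliminary data."
)

def write_story(event):
    """Fill the fixed template with structured event data."""
    return EARTHQUAKE_TEMPLATE.format(**event)

# Hypothetical event record, e.g. as parsed from a sensor feed.
story = write_story({
    "magnitude": 4.4,
    "distance": 5,
    "place": "Westwood, California",
    "time": "6:25 a.m.",
})
print(story)
```

Every story produced this way shares the same skeleton, which illustrates the worry about “factory-produced” content: speed comes at the price of variety.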

Perhaps the last word should go to a professional translator. The work of Rasika Gumaste, an employee in the growing Indian translation hotbed of Pune, is nearly 100% computer-based. But, as he told the Times of India, human translators still have the edge over software, because only they can convey true meaning and context.
