Lilt Labs

Learn and explore everything you need to know about global experience

CEO Spence Green's 2022 Look Back and 2023 Look Ahead

2022 reminded us how uncertain the world can be, and how quickly that uncertainty renders obsolete the best-laid plans. In startups, resilience and adaptability are rewarded. We would like to thank our translators, customers, and partners for helping us make phenomenal advances toward building multilingually-inclusive global customer experiences.

Announcing Lilt's Series C Financing

In ordinary times, a company milestone announcement would begin with a proclamation of progress and achievement. But these are extraordinary times, and as a society we face an unprecedented array of challenges: the lasting effects of a global pandemic, a ground war in Europe, rising inflation, labor and energy shortages, and fractious politics in most of the world. Easing this societal and economic turmoil is most important and should be our focus. Lilt has a role to play in addressing some of these challenges.

Statement on Ukraine

We at Lilt are saddened to witness the events that have unfolded in Ukraine over the past weeks, and we unequivocally condemn Russia’s acts of aggression. To our Ukrainian colleagues, our Ukrainian translator community, and all those with ties to the country, our support for you is unconditional.

Introducing Ascend 2020, Lilt's Annual Localization Conference

Announcing Lilt's Series B Financing

The languages we learn as children influence nearly every aspect of our lives: our communities, our access to information, and even our career prospects. I observed this most acutely while living in the Middle East, where I met bright and ambitious people who were often cut off from intellectual work because they didn't speak English. While the capacity for language is shared by all of humanity and is one of the most fascinating aspects of human intelligence, language differences can divide us socially and economically.

Scaling Localization With Artificial Intelligence and Automation

A few weeks ago, our friends over at GALA co-hosted a webinar with our CEO Spence Green, "Scaling Localization With Artificial Intelligence and Automation."

Lilt Adds 40th Language!

Lilt is happy to announce the addition of Bulgarian and Slovenian to our platform today. With this, Lilt officially supports 40 languages, or 50 when counting language variants such as Castilian and Latin American Spanish.

Announcing Lilt's Series A Financing

Today I’m pleased to announce that we raised $9.5M in new funding led by Sequoia Capital. Bill Coughran, partner at Sequoia, will join our board. Our existing investors (Redpoint Ventures, Zetta Venture Partners, and XSeed Capital) all participated in the round. Series A funding indicates two milestones in an enterprise company’s life: strong revenue and momentum, and a compelling plan for the deployment of new capital. It also marks the start of a new partnership, in this case among us, Bill, and Sequoia. We are thrilled because Bill is that rare leader who has contributed to science, managed large technical teams, and led businesses. Sequoia has also funded the businesses we admire most, among them Google, Apple, and Stripe. In this post, I’ll describe what we’ve achieved, what we plan to do, and why we are certain that Bill is the right partner to help us do it.

What We’re Reading: Domain Attention with an Ensemble of Experts

A major problem in deploying machine learning systems in practice is domain adaptation: given a large auxiliary supervised dataset and a smaller dataset of interest, use the auxiliary data to improve performance on the smaller dataset. This paper considers the setting where we have K datasets from distinct domains and want to adapt quickly to a new one. It trains K separate models, one per dataset, and treats each as an expert. Given a new domain, it then trains another model for that domain and, in addition, computes attention over the experts: a dot product measures the similarity between the new domain’s hidden representation and each of the K experts’ representations.
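As a rough sketch of that attention step (assuming PyTorch; the encoder modules, dimensions, and classifier head here are placeholders, not the paper's actual architecture):

```python
import torch
import torch.nn.functional as F
from torch import nn

class DomainAttentionEnsemble(nn.Module):
    """Sketch: attention over K pre-trained domain experts plus a new-domain model.

    The encoders are stand-ins; the paper uses task-specific networks.
    """

    def __init__(self, experts: list[nn.Module], new_encoder: nn.Module,
                 hidden_dim: int, num_labels: int):
        super().__init__()
        self.experts = nn.ModuleList(experts)  # K pre-trained experts, kept fixed
        for p in self.experts.parameters():
            p.requires_grad = False
        self.new_encoder = new_encoder         # model trained on the new domain
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h_new = self.new_encoder(x)                                   # (batch, hidden)
        h_experts = torch.stack([e(x) for e in self.experts], dim=1)  # (batch, K, hidden)

        # Dot-product similarity between the new domain's representation
        # and each expert's representation, softmax-normalized.
        scores = torch.einsum("bh,bkh->bk", h_new, h_experts)         # (batch, K)
        weights = F.softmax(scores, dim=-1)
        h_expert_mix = torch.einsum("bk,bkh->bh", weights, h_experts)

        # Combine the new-domain representation with the expert mixture.
        return self.classifier(torch.cat([h_new, h_expert_mix], dim=-1))
```

Freezing the expert parameters is one natural design choice under these assumptions: only the new-domain encoder and classifier are trained, so adapting to a new domain stays cheap even as K grows.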