Team Lilt Spotlight: Marina Lee

This week we’re chatting with Lilt team member Marina Lee. Keep reading to learn more about Marina, and don’t forget to say hello to her at ATA 58!

Read more...

Case Study: First Large-Scale Application of Auto-Adaptive MT

Combining Machine Translation (MT) with auto-adaptive Machine Learning (ML) enables a new paradigm of machine assistance. Such systems learn from the experience, intelligence, and insights of their human users, boosting productivity by working in partnership, making suggestions, and improving in accuracy over time.

The net […]

Read more...

Happy Translator’s Day

Happy Translator’s Day, my fellow Translators and Interpreters!

On this day, I would like to recognize and commend my fellow translators for the work we do and what it requires, and to address the layperson’s misconception that any fluent bilingual may as well serve as a qualified translator. That this is not so may be so (painfully) […]

Read more...

Advanced Terminology Management in Lilt

Our new advanced termbase editor lets you manage terminology more effectively by keeping terms organized with meta information that you can customize.

Import terminology with meta fields or add your own fields. Your terms will appear in both the Lexicon and the Editor suggestions and help you increase consistency and quality.

Read more...

Keeping Your Data Secure in Lilt

In a world where data hacks and breaches make front-page news more often than we’d like, a common question translators and businesses have about Lilt is: is my data safe? No need to worry. Lilt was built with that concern in mind. Read the answers below to some common questions about security in Lilt.

Is my […]

Read more...

Cognitive Processes of Interpreting and Translation

Ever wonder what happens “under the hood” in the process of translation and interpretation? Let’s look at the mode of interpretation first. The cognitive processes that take place in a simultaneous interpreter’s mind and brain are intense and happen nearly all at the same time. Neurons are firing in all directions, igniting […]

Read more...

What We’re Reading: Domain Attention with an Ensemble of Experts

A major problem in deploying machine learning systems effectively in practice is domain adaptation: given a large auxiliary supervised dataset and a smaller dataset of interest, use the auxiliary dataset to increase performance on the smaller one. This paper considers the case where we have K datasets from distinct […]
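
The core idea can be sketched in a few lines. Below is a simplified, hypothetical illustration of domain attention, not the paper’s actual architecture: K per-domain experts each produce a prediction, and the input’s representation attends over domain key vectors (dot-product attention) to weight the experts. All vectors and names here are invented for illustration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def domain_attention_mixture(input_vec, domain_keys, expert_outputs):
    """Weight K per-domain experts by attention between the input
    representation and each domain's key vector."""
    scores = [sum(i * k for i, k in zip(input_vec, key)) for key in domain_keys]
    weights = softmax(scores)
    dim = len(expert_outputs[0])
    return [sum(w * out[d] for w, out in zip(weights, expert_outputs))
            for d in range(dim)]

# Toy example with K=2 domains: the input is closer to domain 0's key,
# so the mixture leans toward expert 0's prediction.
x = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
experts = [[0.9, 0.1], [0.2, 0.8]]
mixed = domain_attention_mixture(x, keys, experts)
```

Because the attention weights sum to one, the mixture stays a valid probability distribution whenever each expert’s output is one.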

Read more...

Making the Most of Your First Project in Lilt

Lilt was designed to maximize translation productivity, so you’ll want to get started using it quickly rather than spending your time learning how to use it.

The interface and user experience differ from conventional CAT tools. Change is hard. We know. But we’ve designed the system with the goal of making you productive in […]

Read more...

What We’re Reading: Learning to Decode for Future Success

When performing beam search in sequence-to-sequence models, one explores next words in order of their likelihood. During decoding, however, we may have other constraints or objectives we wish to maximize, such as sequence length, BLEU score, or mutual information between the target and source sentences. In order to […]
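
As a concrete sketch of the kind of decoding the excerpt describes, here is a toy beam search that rescores hypotheses with an extra objective: a per-token length bonus standing in for richer objectives like BLEU or mutual information. The bigram model and token names are invented for illustration, not taken from the paper.

```python
import math

def beam_search(next_probs, start, beam_size, max_len, length_bonus=0.0):
    """Toy beam search: next_probs(token) returns {next_token: prob}.
    Hypotheses are ranked by log-probability plus an optional per-token
    length bonus, a simple stand-in for richer decoding objectives."""
    beams = [([start], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            if seq[-1] == "</s>":          # finished hypotheses carry over
                candidates.append((seq, logp))
                continue
            for tok, p in next_probs(seq[-1]).items():
                candidates.append((seq + [tok], logp + math.log(p)))
        # rescore: likelihood plus a bonus proportional to sequence length
        candidates.sort(key=lambda c: c[1] + length_bonus * len(c[0]),
                        reverse=True)
        beams = candidates[:beam_size]
    return beams

# Toy bigram model over a tiny vocabulary.
MODEL = {
    "<s>": {"a": 0.6, "b": 0.4},
    "a":   {"</s>": 0.9, "a": 0.1},
    "b":   {"b": 0.5, "</s>": 0.5},
}
best = beam_search(lambda t: MODEL[t], "<s>", beam_size=2, max_len=3)[0]
```

Setting `length_bonus` above zero biases the search toward longer outputs without changing the underlying model, which is the simplest version of decoding for an objective other than pure likelihood.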

Read more...

FREE THE TRANSLATORS! How Adaptive MT turns post-editing janitors into cultural consultants

Originally posted on LinkedIn by Greg Rosner.

I saw the phrase “linguistic janitorial work” in this Deloitte whitepaper on “AI-augmented government, using cognitive technologies to redesign public sector work”, used to describe the drudgery of translation work that so many translators are required to do today through […]

Read more...