
Apple research tackles artificial intelligence's English accent

Ask any non-native English speaker, and they will probably tell you that LLMs perform much better in the language of Shakespeare.

Sometimes that difference is minor. Sometimes not so much. And sometimes it is downright dangerous, as shown in a 2023 Carnegie Mellon study which found that non-English inputs can more easily slip past safety filters.

Current large language models are developed predominantly with English as the primary language, and even the few that are multilingual tend to exhibit strong English-centric biases, producing output in other languages that reflects English-oriented patterns in both vocabulary and grammar.

In other words, even when these models generate Chinese or French, they still "think" in English. The result? Non-English output still follows English-like grammatical and vocabulary patterns. Would it pass for a native speaker? Not quite: even the Chinese-developed Qwen model underperformed in every language, including Chinese. Meta's Llama 3.1 was the most natural overall, but still lagged far behind human-level output.
