<h1>Using Hugging Face Transformers for M2M100 Chinese-English Automatic Translation</h1>
<p>
    With recent advances in artificial intelligence, the field of natural language processing has made significant progress. Among the tools driving that progress, <strong>Hugging Face</strong>'s <strong>Transformers</strong> library has established itself as an easy way to work with a wide range of language models. In this course, we explain in detail how to implement automatic translation between Chinese and English using the <strong>M2M100</strong> model with Hugging Face.
</p>
<h2>1. Introduction to the M2M100 Model</h2>
<p>
    M2M100 is a multilingual translation model that supports direct conversion between many language pairs. It covers more than 100 languages and, unlike traditional pivot-based systems, translates directly between any two of them without routing through an intermediate language.
</p>
<h2>2. Installation and Setup</h2>
<p>
    To use the M2M100 model, first install the <strong>Hugging Face Transformers</strong> library and its dependencies. You can install them with <code>pip</code> as shown below.
</p>
<pre><code>pip install transformers torch</code></pre>
<h2>3. Loading the Model and Implementing the Translation Function</h2>
<p>
    The first step is to load the M2M100 model.
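</p>
<p>
    Before going further, it can help to confirm that the packages installed above import cleanly. The snippet below is only a sanity check and is not part of the tutorial's model code.
</p>

```python
# Sanity check: confirm the dependencies from section 2 are importable
# and report their versions.
import torch
import transformers

print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
```

<p>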
    The following code loads the model and tokenizer and implements a simple translation function.
</p>
<pre><code>
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Load the model and tokenizer
model_name = "facebook/m2m100_418M"
tokenizer = M2M100Tokenizer.from_pretrained(model_name)
model = M2M100ForConditionalGeneration.from_pretrained(model_name)

def translate(text, source_language, target_language):
    # Tell the tokenizer which language the input is in
    tokenizer.src_lang = source_language
    encoded_input = tokenizer(text, return_tensors="pt")
    # Force the decoder to start with the target-language token
    generated_tokens = model.generate(
        **encoded_input,
        forced_bos_token_id=tokenizer.get_lang_id(target_language),
    )
    return tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)[0]
</code></pre>
<h3>3.1 Explanation of the Translation Function</h3>
<p>
    The code above works as follows:</p>
<ol>
<li><code>tokenizer.src_lang</code>: sets the source language.</li>
<li><code>tokenizer()</code>: tokenizes the input text.</li>
<li><code>model.generate()</code>: performs the translation based on the tokenized input.</li>
<li><code>tokenizer.batch_decode()</code>: decodes the generated tokens and returns the translated text.</li>
</ol>
<h2>4. Translation Examples</h2>
<p>
    Now let's test the translation function. The example below translates a Chinese sentence into English.
</p>
<pre><code>
# Sentence to be translated
text = "你好，世界！"  # Hello, World!
source_lang = "zh"  # Chinese
target_lang = "en"  # English

# Perform the translation
translated_text = translate(text, source_lang, target_lang)
print(f"Translation result: {translated_text}")
</code></pre>
<h3>4.1 Interpretation of the Results</h3>
<p>
    Running the code above should print the English sentence "Hello, World!".
    The M2M100 model can translate effectively even between languages with quite different sentence structures.
</p>
<h2>5. Multilingual Translation Examples</h2>
<p>
    One of the most powerful features of the M2M100 model is its broad language support. The example below translates between several languages, including Korean, French, and Spanish.
</p>
<pre><code>
# Multilingual translation test
samples = [
    {"text": "여러 언어를 지원하는 모델입니다.", "source": "ko", "target": "en"},  # Korean to English
    {"text": "Bonjour le monde!", "source": "fr", "target": "ko"},  # French to Korean
    {"text": "¡Hola Mundo!", "source": "es", "target": "ja"},  # Spanish to Japanese
]

for sample in samples:
    translated = translate(sample["text"], sample["source"], sample["target"])
    print(f"{sample['text']} ({sample['source']}) -> {translated} ({sample['target']})")
</code></pre>
<h3>5.1 Multilingual Translation Results</h3>
<p>
    Running the code above prints translations between several language pairs. The key point is that M2M100 translates each pair directly, without going through an intermediate language.
</p>
<h2>6. Performance Evaluation</h2>
<p>
    Translation quality can be evaluated with the BLEU (Bilingual Evaluation Understudy) score, which quantitatively measures the similarity between a generated translation and a reference translation.
    The following code computes the BLEU score with NLTK (install it first if needed, e.g. <code>pip install nltk</code>). It assumes <code>translated_text</code> from section 4 is still in scope.
</p>
<pre><code>
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Reference translation and system translation, both whitespace-tokenized
reference = "Hello, World!".split()
candidate = translated_text.split()

# Calculate the BLEU score; smoothing prevents a zero score when a short
# sentence has no higher-order n-gram matches
smoothie = SmoothingFunction().method1
bleu_score = sentence_bleu([reference], candidate, smoothing_function=smoothie)
print(f"BLEU score: {bleu_score:.4f}")
</code></pre>
<h3>6.1 Interpretation of Performance Evaluation</h3>
<p>
    A BLEU score close to 0 indicates a poor translation, while a score close to 1 indicates a high-quality one. Evaluating on a variety of examples and reference translations gives a fuller picture of translation performance across languages.
</p>
<h2>7. Conclusion</h2>
<p>
    Hugging Face's M2M100 model represents a major advance in multilingual translation. In this course, we walked through a basic example of automatic translation between Chinese and English using the M2M100 model. Because the model converts between languages directly, it can translate between any of its supported languages without an intermediate language.
</p>
<p>
    Going forward, try experimenting with more languages and more complex sentences to explore what this model can do. The Hugging Face Transformers library applies to a wide range of NLP tasks, so feel free to use it in other projects as well.
</p>
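<p>
    As a closing aside on section 6: when scoring more than one sentence (for example, the multilingual samples from section 5), a corpus-level BLEU score is usually more stable than averaging per-sentence scores. The sketch below uses NLTK's <code>corpus_bleu</code>; the reference/candidate pairs are made-up placeholders, not real model output.
</p>

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Made-up reference/candidate token lists standing in for real model
# output. Each entry in `references` is a list of acceptable references
# for the corresponding candidate.
references = [
    [["this", "is", "a", "model", "that", "supports", "many", "languages"]],
    [["hello", "world"]],
]
candidates = [
    ["this", "is", "a", "model", "supporting", "many", "languages"],
    ["hello", "world"],
]

# Smoothing keeps short sentences from zeroing out higher-order n-grams
smoothie = SmoothingFunction().method1
score = corpus_bleu(references, candidates, smoothing_function=smoothie)
print(f"Corpus BLEU: {score:.4f}")
```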