<h1>The Hugging Face Transformers Practical Course: Encoding and Decoding</h1>
<p>In the field of deep learning, natural language processing (NLP) is one of the areas that has received particular attention. The <a href="https://github.com/huggingface/transformers">Transformers library by Hugging Face</a>, first released in 2018, makes it easy to work with state-of-the-art NLP models. This course covers how to perform encoding and decoding with the Hugging Face Transformers library.</p>
<h2>1. Introduction to the Transformers Library</h2>
<p>The Transformers library supports a wide range of neural network architectures, such as BERT, GPT-2, and T5. It lets you use complex NLP models with only a few lines of code, and it is employed in both personal research and commercial projects.</p>
<h3>1.1 Installation</h3>
<p>Install the Transformers library with pip:</p>
<pre><code>pip install transformers</code></pre>
<h2>2. Text Encoding</h2>
<p>Encoding is the process of converting text into a numerical format that the model can understand. The Transformers library uses a tokenizer to encode text.</p>
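<p>Conceptually, a tokenizer maps each token to an integer ID from its vocabulary, adds special tokens such as <code>[CLS]</code> and <code>[SEP]</code>, and builds an attention mask. The plain-Python sketch below is illustrative only: the mini vocabulary is made up for this example, while real tokenizers use learned subword vocabularies with tens of thousands of entries and split unknown words into subword pieces.</p>

```python
# Illustrative only: a made-up mini vocabulary. Real tokenizers use
# learned subword vocabularies of roughly 30,000 entries.
vocab = {"[PAD]": 0, "[UNK]": 100, "[CLS]": 101, "[SEP]": 102,
         "hello": 7592, ",": 1010, "how": 2129, "are": 2024,
         "you": 2017, "?": 1029}

def toy_encode(tokens, max_len=12):
    """Map tokens to IDs, add special tokens, pad, and build the attention mask."""
    ids = [vocab["[CLS]"]] + [vocab.get(t, vocab["[UNK]"]) for t in tokens] + [vocab["[SEP]"]]
    attention_mask = [1] * len(ids) + [0] * (max_len - len(ids))
    ids = ids + [vocab["[PAD]"]] * (max_len - len(ids))
    return ids, attention_mask

ids, mask = toy_encode(["hello", ",", "how", "are", "you", "?"])
print(ids)   # [101, 7592, 1010, 2129, 2024, 2017, 1029, 102, 0, 0, 0, 0]
print(mask)  # [1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
```

<p>The real tokenizer below produces exactly these kinds of fields, just from a trained vocabulary.</p>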
<p>Here is an example of encoding text with the BERT tokenizer.</p>
<h3>2.1 BERT Tokenizer Example</h3>
<p>The code below encodes an input sentence with the pre-trained BERT tokenizer.</p>
<pre><code>from transformers import BertTokenizer

# Load the pre-trained BERT tokenizer
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

# Text to be encoded
text = "Hello, how are you?"

# Encode the text
encoded_input = tokenizer(text, return_tensors='pt')

# Print the results
print(encoded_input)</code></pre>
<p>In the code above, <code>BertTokenizer.from_pretrained()</code> loads the pre-trained BERT tokenizer, and calling <code>tokenizer()</code> encodes the input sentence. The argument <code>return_tensors='pt'</code> requests PyTorch tensors; <code>'tf'</code> would return TensorFlow tensors and <code>'np'</code> NumPy arrays.</p>
<h3>2.2 Explanation of Encoding Results</h3>
<p>The encoding result has the following structure:</p>
<ul>
<li><strong>input_ids:</strong> the integer vocabulary IDs of each token, including special tokens such as <code>[CLS]</code> and <code>[SEP]</code>.</li>
<li><strong>token_type_ids:</strong> segment IDs that distinguish the first sentence from the second in sentence-pair inputs.</li>
<li><strong>attention_mask:</strong> a mask that is 1 for real tokens and 0 for padding.</li>
</ul>
<h3>2.3 Output of Encoding Results</h3>
<pre><code>input_ids = encoded_input['input_ids']
token_type_ids = encoded_input['token_type_ids']
attention_mask = encoded_input['attention_mask']

print("Input IDs:", input_ids)
print("Token Type IDs:", token_type_ids)
print("Attention Mask:", attention_mask)</code></pre>
<p>Printing the encoding results lets you inspect each field and see exactly what the model receives as input.</p>
<h2>3. Text Decoding</h2>
<p>Decoding is the process of transforming the model's output into a format that humans can understand.</p>
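<p>For a classification model, the heart of this kind of decoding is two steps: softmax to turn raw scores (logits) into probabilities, then argmax to pick the most likely class. The stdlib-only sketch below uses made-up logits for a hypothetical 3-class model, so the numbers are illustrative rather than real model output.</p>

```python
import math

def softmax(logits):
    """Convert raw scores to probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [-1.0, 0.0, 2.0]  # made-up scores for a 3-class model
probs = softmax(logits)
predicted_class = probs.index(max(probs))

print([round(p, 3) for p in probs])  # [0.042, 0.114, 0.844]
print(predicted_class)               # 2
```

<p>The PyTorch version in the next example does exactly this with <code>torch.nn.functional.softmax</code> and <code>argmax</code>, just on tensors.</p>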
<p>The Hugging Face Transformers library also makes decoding straightforward.</p>
<h3>3.1 Simple Decoding Example</h3>
<p>The code below turns the model's output into a predicted class for the <code>encoded_input</code> from section 2. Note that loading <code>bert-base-uncased</code> into <code>BertForSequenceClassification</code> attaches a randomly initialized classification head, so the prediction is only meaningful after fine-tuning; the library prints a warning to this effect.</p>
<pre><code>from transformers import BertForSequenceClassification
import torch

# Load BERT with a (randomly initialized) sequence-classification head
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
model.eval()

# Run the model for predictions
with torch.no_grad():
    outputs = model(**encoded_input)

# Extract logits from the results
logits = outputs.logits

# Convert logits to probabilities
probabilities = torch.nn.functional.softmax(logits, dim=-1)

# Pick the class with the highest probability
predicted_class = probabilities.argmax(dim=-1).item()
print("Predicted Class:", predicted_class)</code></pre>
<p>In the code above, the BERT model makes a prediction from the encoded input. The logits are converted into probabilities with the softmax function, and the class with the highest probability is taken as the prediction.</p>
<h3>3.2 Multi-class Classification</h3>
<p>Multi-class classification problems occur frequently in natural language processing. The following metrics are commonly used to evaluate such classifiers:</p>
<ul>
<li><strong>Accuracy:</strong> the proportion of samples classified correctly.</li>
<li><strong>Precision:</strong> the proportion of actual positives among the samples predicted positive.</li>
<li><strong>Recall:</strong> the proportion of actual positives that were predicted positive.</li>
<li><strong>F1 Score:</strong> the harmonic mean of precision and recall.</li>
</ul>
<p>These metrics are useful for evaluating how well the model performs.</p>
<h2>4. Conclusion</h2>
<p>We have seen how to encode inputs and decode model outputs with the Transformers library. With the examples in this course you can apply pre-trained models to a variety of tasks. I hope it proves helpful in your future research and projects.</p>
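<p>As a small supplement, the metric definitions from section 3.2 can be made concrete with a stdlib-only computation over made-up binary labels (in practice you would use a library such as scikit-learn's <code>sklearn.metrics</code>):</p>

```python
# Made-up labels for illustration: 1 = positive, 0 = negative
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

# Count true positives, false positives, and false negatives
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy  = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)  # actual positives among predicted positives
recall    = tp / (tp + fn)  # actual positives that were predicted positive
f1        = 2 * precision * recall / (precision + recall)

print(round(accuracy, 3), round(precision, 3),
      round(recall, 3), round(f1, 3))  # 0.8 0.8 0.8 0.8
```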
<h2>5. References</h2>
<ul>
<li><a href="https://huggingface.co/docs/transformers/index">Hugging Face Transformers Documentation</a></li>
<li><a href="https://arxiv.org/abs/1810.04805">BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding</a></li>
<li><a href="https://github.com/huggingface/transformers">Hugging Face GitHub Repository</a></li>
</ul>