{"id":36117,"date":"2024-11-01T09:45:53","date_gmt":"2024-11-01T09:45:53","guid":{"rendered":"http:\/\/atmokpo.com\/w\/?p=36117"},"modified":"2024-11-01T09:45:53","modified_gmt":"2024-11-01T09:45:53","slug":"hugging-face-transformers-tutorial-dialogpt-environment-setup","status":"publish","type":"post","link":"https:\/\/atmokpo.com\/w\/36117\/","title":{"rendered":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup"},"content":{"rendered":"<p><body><\/p>\n<p>The recent advancements in deep learning technology have brought innovation to the field of Natural Language Processing (NLP). In particular, the Transformers library provided by Hugging Face allows developers and researchers to easily utilize various pre-trained models, making it very popular. Among them, <strong>DialoGPT<\/strong> is a prominent example of a conversational model, incredibly useful for generating natural and appropriate responses in conversations with users.<\/p>\n<h2>1. What is DialoGPT?<\/h2>\n<p>DialoGPT is a conversational AI model developed by Microsoft, based on the GPT-2 architecture. This model has been trained on a large amount of conversational data and is skilled in understanding the context of conversations and generating coherent statements. Essentially, DialoGPT has the following features:<\/p>\n<ul>\n<li>Natural conversation generation: Generates relevant responses to user inputs.<\/li>\n<li>Handling a variety of topics: Can engage in conversations on various topics and generate context-appropriate answers.<\/li>\n<li>Improving user experience: Has features that can provide a human-like feel during interaction with users.<\/li>\n<\/ul>\n<h2>2. Environment Setup<\/h2>\n<p>Now, let&#8217;s set up the environment to use DialoGPT. You can proceed by following the steps below.<\/p>\n<h3>2.1 Install Python and Packages<\/h3>\n<p>Before getting started, make sure you have Python installed. If it is not installed, you can download it from the official Python website. 
<p>Additionally, you need to install the required packages, which you can do with the <code>pip</code> command:</p>
<pre><code>pip install transformers torch</code></pre>

<h3>2.2 Writing the Code</h3>
<p>Now let's write some code that loads the DialoGPT model and holds a simple conversation. The code below initializes DialoGPT and generates a response to each user input. Note that after each turn, the model's reply is kept in the conversation history so that later turns stay in context.</p>

<pre><code>from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Initialize model and tokenizer
model_name = "microsoft/DialoGPT-medium"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tensor holding the token IDs of the conversation so far
chat_history_ids = None

# Start conversation in an infinite loop
while True:
    user_input = input("User: ")

    # Tokenize user input and append the end-of-sequence token
    new_user_input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors='pt')

    # Append the new input to the conversation history
    if chat_history_ids is not None:
        chat_history_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1)
    else:
        chat_history_ids = new_user_input_ids

    # Generate a response conditioned on the full history
    response_ids = model.generate(chat_history_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

    # Decode only the newly generated tokens (everything after the history)
    bot_output = tokenizer.decode(response_ids[:, chat_history_ids.shape[-1]:][0], skip_special_tokens=True)

    # Keep the model's reply in the history so the next turn sees it
    chat_history_ids = response_ids

    print(f"Model: {bot_output}")
</code></pre>

<h3>2.3 Code Explanation</h3>
<p>The example code above works as follows:</p>
<ul>
<li>Import <code>AutoModelForCausalLM</code> and <code>AutoTokenizer</code> to prepare the model and tokenizer.</li>
<li>Store the name of the DialoGPT model in the <code>model_name</code> variable. 
Here, we use the medium-sized model <strong>DialoGPT-medium</strong>.</li>
<li>The <code>tokenizer.encode</code> method tokenizes the user input (plus an end-of-sequence token) and converts it into a tensor.</li>
<li>The model's <code>generate</code> method produces a response conditioned on the context of the conversation.</li>
<li>The <code>tokenizer.decode</code> method decodes the generated tokens so the response can be printed.</li>
</ul>

<h2>3. Additional Settings and Utilization</h2>
<p>While using the DialoGPT model, several additional settings are worth considering for better results: for example, managing the conversation history efficiently so that context is preserved, or adjusting the length of the model's responses.</p>

<h3>3.1 Managing Conversation History</h3>
<p>To keep the flow of the conversation natural, record both user inputs and model responses in <code>chat_history_ids</code>; this lets the model take the earlier turns into account. Because the underlying GPT-2 architecture has a context window of 1024 tokens, long conversations should be truncated to the most recent turns rather than accumulated indefinitely.</p>

<h3>3.2 Adjustable Parameters</h3>
<p>You can adjust parameters such as <code>max_length</code> to control how long generation may run. To increase the diversity of the generated responses, enable sampling with <code>do_sample=True</code> and tune the <code>temperature</code> parameter (note that <code>temperature</code> has no effect under the default greedy decoding):</p>
<pre><code>response_ids = model.generate(chat_history_ids, max_length=1000, do_sample=True, temperature=0.7, pad_token_id=tokenizer.eos_token_id)</code></pre>

<h2>4. Conclusion</h2>
<p>In this tutorial, we set up an environment for the DialoGPT model using the Hugging Face Transformers library. DialoGPT is a powerful tool for building conversational AI services quickly and easily. By experimenting with the generation parameters and history-management techniques above, you can develop more capable conversational AI systems.</p>
<h3>5. 
References</h3>
<ul>
<li><a href="https://huggingface.co/transformers/model_doc/dialogpt.html">Hugging Face DialoGPT Documentation</a></li>
<li><a href="https://www.microsoft.com/en-us/research/project/dialogpt/">Microsoft DialoGPT Page</a></li>
<li><a href="https://huggingface.co/docs/transformers/index">Hugging Face Transformers Documentation</a></li>
</ul>

</body>
property=\"og:title\" content=\"Hugging Face Transformers Tutorial, DialoGPT Environment Setup - \ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\" \/>\n<meta property=\"og:description\" content=\"The recent advancements in deep learning technology have brought innovation to the field of Natural Language Processing (NLP). In particular, the Transformers library provided by Hugging Face allows developers and researchers to easily utilize various pre-trained models, making it very popular. Among them, DialoGPT is a prominent example of a conversational model, incredibly useful for &hellip; \ub354 \ubcf4\uae30 &quot;Hugging Face Transformers Tutorial, DialoGPT Environment Setup&quot;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/atmokpo.com\/w\/36117\/\" \/>\n<meta property=\"og:site_name\" content=\"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\" \/>\n<meta property=\"article:published_time\" content=\"2024-11-01T09:45:53+00:00\" \/>\n<meta name=\"author\" content=\"root\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@bebubo4\" \/>\n<meta name=\"twitter:site\" content=\"@bebubo4\" \/>\n<meta name=\"twitter:label1\" content=\"\uae00\uc4f4\uc774\" \/>\n\t<meta name=\"twitter:data1\" content=\"root\" \/>\n\t<meta name=\"twitter:label2\" content=\"\uc608\uc0c1 \ub418\ub294 \ud310\ub3c5 \uc2dc\uac04\" \/>\n\t<meta name=\"twitter:data2\" content=\"3\ubd84\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/\"},\"author\":{\"name\":\"root\",\"@id\":\"https:\/\/atmokpo.com\/w\/#\/schema\/person\/91b6b3b138fbba0efb4ae64b1abd81d7\"},\"headline\":\"Hugging Face Transformers Tutorial, DialoGPT Environment 
Setup\",\"datePublished\":\"2024-11-01T09:45:53+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/\"},\"wordCount\":506,\"publisher\":{\"@id\":\"https:\/\/atmokpo.com\/w\/#organization\"},\"articleSection\":[\"Using Hugging Face\"],\"inLanguage\":\"ko-KR\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/\",\"url\":\"https:\/\/atmokpo.com\/w\/36117\/\",\"name\":\"Hugging Face Transformers Tutorial, DialoGPT Environment Setup - \ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\",\"isPartOf\":{\"@id\":\"https:\/\/atmokpo.com\/w\/#website\"},\"datePublished\":\"2024-11-01T09:45:53+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/#breadcrumb\"},\"inLanguage\":\"ko-KR\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/atmokpo.com\/w\/36117\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/atmokpo.com\/w\/36117\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"\ud648\",\"item\":\"https:\/\/atmokpo.com\/w\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Hugging Face Transformers Tutorial, DialoGPT Environment 
Setup\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/atmokpo.com\/w\/#website\",\"url\":\"https:\/\/atmokpo.com\/w\/\",\"name\":\"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\",\"description\":\"\",\"publisher\":{\"@id\":\"https:\/\/atmokpo.com\/w\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/atmokpo.com\/w\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"ko-KR\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/atmokpo.com\/w\/#organization\",\"name\":\"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\",\"url\":\"https:\/\/atmokpo.com\/w\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"ko-KR\",\"@id\":\"https:\/\/atmokpo.com\/w\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/atmokpo.com\/w\/wp-content\/uploads\/2024\/11\/logo.png\",\"contentUrl\":\"https:\/\/atmokpo.com\/w\/wp-content\/uploads\/2024\/11\/logo.png\",\"width\":400,\"height\":400,\"caption\":\"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8\"},\"image\":{\"@id\":\"https:\/\/atmokpo.com\/w\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/bebubo4\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/atmokpo.com\/w\/#\/schema\/person\/91b6b3b138fbba0efb4ae64b1abd81d7\",\"name\":\"root\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"ko-KR\",\"@id\":\"https:\/\/atmokpo.com\/w\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/708197b41fc6435a7ce22d951b25d4a47e9e904270cb1f04682d4f025066f80c?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/708197b41fc6435a7ce22d951b25d4a47e9e904270cb1f04682d4f025066f80c?s=96&d=mm&r=g\",\"caption\":\"root\"},\"sameAs\":[\"http:\/\/atmokpo.com\/w\"],\"url\":\"https:\/\/atmokpo.com\/w\/author\/root\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup - \ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/atmokpo.com\/w\/36117\/","og_locale":"ko_KR","og_type":"article","og_title":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup - \ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","og_description":"The recent advancements in deep learning technology have brought innovation to the field of Natural Language Processing (NLP). In particular, the Transformers library provided by Hugging Face allows developers and researchers to easily utilize various pre-trained models, making it very popular. Among them, DialoGPT is a prominent example of a conversational model, incredibly useful for &hellip; \ub354 \ubcf4\uae30 \"Hugging Face Transformers Tutorial, DialoGPT Environment Setup\"","og_url":"https:\/\/atmokpo.com\/w\/36117\/","og_site_name":"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","article_published_time":"2024-11-01T09:45:53+00:00","author":"root","twitter_card":"summary_large_image","twitter_creator":"@bebubo4","twitter_site":"@bebubo4","twitter_misc":{"\uae00\uc4f4\uc774":"root","\uc608\uc0c1 \ub418\ub294 \ud310\ub3c5 \uc2dc\uac04":"3\ubd84"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/atmokpo.com\/w\/36117\/#article","isPartOf":{"@id":"https:\/\/atmokpo.com\/w\/36117\/"},"author":{"name":"root","@id":"https:\/\/atmokpo.com\/w\/#\/schema\/person\/91b6b3b138fbba0efb4ae64b1abd81d7"},"headline":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup","datePublished":"2024-11-01T09:45:53+00:00","mainEntityOfPage":{"@id":"https:\/\/atmokpo.com\/w\/36117\/"},"wordCount":506,"publisher":{"@id":"https:\/\/atmokpo.com\/w\/#organization"},"articleSection":["Using Hugging 
Face"],"inLanguage":"ko-KR"},{"@type":"WebPage","@id":"https:\/\/atmokpo.com\/w\/36117\/","url":"https:\/\/atmokpo.com\/w\/36117\/","name":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup - \ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","isPartOf":{"@id":"https:\/\/atmokpo.com\/w\/#website"},"datePublished":"2024-11-01T09:45:53+00:00","breadcrumb":{"@id":"https:\/\/atmokpo.com\/w\/36117\/#breadcrumb"},"inLanguage":"ko-KR","potentialAction":[{"@type":"ReadAction","target":["https:\/\/atmokpo.com\/w\/36117\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/atmokpo.com\/w\/36117\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"\ud648","item":"https:\/\/atmokpo.com\/w\/en\/"},{"@type":"ListItem","position":2,"name":"Hugging Face Transformers Tutorial, DialoGPT Environment Setup"}]},{"@type":"WebSite","@id":"https:\/\/atmokpo.com\/w\/#website","url":"https:\/\/atmokpo.com\/w\/","name":"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","description":"","publisher":{"@id":"https:\/\/atmokpo.com\/w\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/atmokpo.com\/w\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"ko-KR"},{"@type":"Organization","@id":"https:\/\/atmokpo.com\/w\/#organization","name":"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8","url":"https:\/\/atmokpo.com\/w\/","logo":{"@type":"ImageObject","inLanguage":"ko-KR","@id":"https:\/\/atmokpo.com\/w\/#\/schema\/logo\/image\/","url":"https:\/\/atmokpo.com\/w\/wp-content\/uploads\/2024\/11\/logo.png","contentUrl":"https:\/\/atmokpo.com\/w\/wp-content\/uploads\/2024\/11\/logo.png","width":400,"height":400,"caption":"\ub77c\uc774\ube0c\uc2a4\ub9c8\ud2b8"},"image":{"@id":"https:\/\/atmokpo.com\/w\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/bebubo4"]},{"@type":"Person","@id":"https:\/\/atmokpo.com\/w\/#\/schema\/person\/91b6b3b138fbba0e
fb4ae64b1abd81d7","name":"root","image":{"@type":"ImageObject","inLanguage":"ko-KR","@id":"https:\/\/atmokpo.com\/w\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/708197b41fc6435a7ce22d951b25d4a47e9e904270cb1f04682d4f025066f80c?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/708197b41fc6435a7ce22d951b25d4a47e9e904270cb1f04682d4f025066f80c?s=96&d=mm&r=g","caption":"root"},"sameAs":["http:\/\/atmokpo.com\/w"],"url":"https:\/\/atmokpo.com\/w\/author\/root\/"}]}},"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/posts\/36117","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/comments?post=36117"}],"version-history":[{"count":1,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/posts\/36117\/revisions"}],"predecessor-version":[{"id":36118,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/posts\/36117\/revisions\/36118"}],"wp:attachment":[{"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/media?parent=36117"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/categories?post=36117"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/atmokpo.com\/w\/wp-json\/wp\/v2\/tags?post=36117"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}