[Embedded post] "Deep Learning for Natural Language Processing: Sentence BERT (SBERT)" by root, on 라이브스마트 (https://atmokpo.com/w/32379/)

Description: In recent years, Natural Language Processing (NLP) has advanced rapidly thanks to the development of deep learning technologies. In particular, the BERT (Bidirectional Encoder Representations from Transformers) model has demonstrated groundbreaking achievements in the field of natural language understanding, proving its performance across various NLP tasks. However, BERT can be inefficient for tasks that require … [Read more]