# Deep Learning PyTorch Course, Bidirectional RNN Structure

Advances in deep learning have increased the demand for models that process sequence data. The RNN (Recurrent Neural Network) is one of the representative architectures for this task. In this article, we take a close look at the concept of the bidirectional RNN and how to implement it using PyTorch.

## 1. Understanding RNNs (Recurrent Neural Networks)

An RNN is a neural network with a recurrent structure, designed to process sequence data such as text or time series. Whereas a conventional feed-forward network maps one input to one output, an RNN carries a hidden state from step to step: it remembers the previous state and uses it to update the current one. This allows an RNN to learn the temporal dependencies of a sequence.

### 1.1. Basic Structure of an RNN

An RNN cell resembles a basic neuron, but it is applied repeatedly across time, passing its hidden state forward. Below is the information flow of a single RNN cell:

```
     h(t-1)
       |
       v
     (W_hh)
       |
    +---------+
    |         |
  x(t) --> (tanh) --> h(t)
    |         |
    +---------+
```

Here `h(t-1)` is the hidden state from the previous time step, and it is combined with the current input to compute the current hidden state: `h(t) = tanh(W_xh * x(t) + W_hh * h(t-1) + b)`. The weight matrix `W_hh` transforms the previous hidden state into its contribution to the current one.
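A single RNN step computes `h(t) = tanh(W_xh * x(t) + W_hh * h(t-1) + b)`. Here is a minimal sketch of that update with explicit tensors (the sizes and random weights are illustrative assumptions, not values from this course):

```python
import torch

# Illustrative sizes for one RNN step
input_size, hidden_size = 4, 3
torch.manual_seed(0)

W_xh = torch.randn(hidden_size, input_size)   # input-to-hidden weights
W_hh = torch.randn(hidden_size, hidden_size)  # hidden-to-hidden weights
b = torch.zeros(hidden_size)                  # bias

x_t = torch.randn(input_size)                 # current input x(t)
h_prev = torch.zeros(hidden_size)             # previous hidden state h(t-1)

# h(t) = tanh(W_xh x(t) + W_hh h(t-1) + b)
h_t = torch.tanh(W_xh @ x_t + W_hh @ h_prev + b)
print(h_t.shape)  # torch.Size([3])
```

Because of the `tanh`, every component of `h(t)` stays in (-1, 1), which keeps the recurrence numerically bounded.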
### 1.2. Limitations of RNNs

A plain RNN runs into "memory limitations" when processing long sequences: as gradients are propagated back through many time steps they tend to vanish, so information from early inputs is easily lost. To address this, architectures such as the LSTM (Long Short-Term Memory) and the GRU (Gated Recurrent Unit) were developed.

## 2. Bidirectional RNN

A bidirectional RNN processes a sequence in both directions, so at each time step it can draw on information from both the past (forward direction) and the future (backward direction).

### 2.1. Basic Idea of the Bidirectional RNN

A bidirectional RNN uses two RNN layers: one processes the input sequence from left to right (forward), the other from right to left (backward). Below is a simple illustration of the structure:

```
   Forward RNN          Backward RNN
   (left to right)      (right to left)
        |                    |
      h_f(t)               h_b(t)
         \                  /
          +--> (concat) <--+
                  |
              output(t)
```

The forward and backward RNNs process the input independently, and at each time step their hidden states are combined (typically concatenated) to form the final output. This way the model can use the context of the entire sequence at every position.

## 3. Implementing a Bidirectional RNN with PyTorch

Now let's implement a bidirectional RNN using PyTorch. As toy data we will use a short string, and the model will learn to predict the next character.

### 3.1. Importing Required Libraries

```python
import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
```

### 3.2. Preparing the Data

The input data is a simple string, and the task is to predict the character that follows each window of consecutive characters.
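To make the windowing concrete before the full preparation code, here is how overlapping input windows and next-character targets line up on a tiny toy string (an assumed example, not the course data):

```python
# Sliding-window pairs on a toy string: each window of seq_length characters
# is paired with the single character that follows it
data = "hello"
seq_length = 2
pairs = [(data[i:i + seq_length], data[i + seq_length])
         for i in range(len(data) - seq_length)]
print(pairs)  # [('he', 'l'), ('el', 'l'), ('ll', 'o')]
```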
Below is the data preparation code:

```python
# Data and character set
data = "hello deep learning with pytorch"
chars = sorted(list(set(data)))
char_to_index = {ch: ix for ix, ch in enumerate(chars)}
index_to_char = {ix: ch for ix, ch in enumerate(chars)}

# Hyperparameters
seq_length = 5
input_size = len(chars)    # size of the character vocabulary
hidden_size = 128
num_layers = 2
output_size = len(chars)

# Build the dataset: each input is a window of seq_length character indices,
# and the target is the index of the character that follows that window
inputs = []
targets = []
for i in range(len(data) - seq_length):
    inputs.append([char_to_index[ch] for ch in data[i:i + seq_length]])
    targets.append(char_to_index[data[i + seq_length]])

inputs = np.array(inputs)
targets = np.array(targets)
```

### 3.3. Defining the Bidirectional RNN Model

Now let's define the bidirectional RNN model. In PyTorch, an RNN layer is created with `nn.RNN()` (or `nn.LSTM()`), and passing `bidirectional=True` makes it bidirectional. Here we use `nn.RNN()`:

```python
class BiRNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, num_layers):
        super(BiRNN, self).__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers

        # Bidirectional RNN layer
        self.rnn = nn.RNN(input_size, hidden_size, num_layers,
                          batch_first=True, bidirectional=True)
        # The two directions are concatenated, so the classifier
        # input size is hidden_size * 2
        self.fc = nn.Linear(hidden_size * 2, output_size)

    def forward(self, x):
        # out has shape (batch, seq_length, hidden_size * 2)
        out, _ = self.rnn(x)
        # Take the output of the last time step
        out = out[:, -1, :]
        # Final classification over the character set
        out = self.fc(out)
        return out
```

### 3.4. Training the Model

Having defined the model, let's implement the training process.
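Before training, it is worth sanity-checking the shapes `nn.RNN` produces with `bidirectional=True` (the sizes below are illustrative assumptions): the output feature dimension doubles, which is exactly why the classifier layer takes `hidden_size * 2` inputs.

```python
import torch
import torch.nn as nn

# Illustrative sizes for a shape check
input_size, hidden_size, num_layers = 10, 16, 2
batch, seq_length = 3, 5

rnn = nn.RNN(input_size, hidden_size, num_layers,
             batch_first=True, bidirectional=True)
x = torch.randn(batch, seq_length, input_size)

out, h_n = rnn(x)
print(out.shape)  # (batch, seq_length, 2 * hidden_size): torch.Size([3, 5, 32])
print(h_n.shape)  # (num_layers * 2, batch, hidden_size): torch.Size([4, 3, 16])
```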
The dataset here is tiny, so we simply train on the full batch each epoch (for larger datasets, PyTorch's `DataLoader` would handle batching), with `CrossEntropyLoss` as the loss function. Note that the integer character indices must be one-hot encoded to match the RNN's `input_size`:

```python
import torch.nn.functional as F

# Hyperparameters
num_epochs = 200
learning_rate = 0.01

# Model, loss function, and optimizer
model = BiRNN(input_size, hidden_size, output_size, num_layers)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=learning_rate)

# One-hot encode the index sequences: (N, seq_length) -> (N, seq_length, input_size)
x_batch = F.one_hot(torch.tensor(inputs), num_classes=input_size).float()
y_batch = torch.tensor(targets, dtype=torch.long)

# Training loop (full batch)
for epoch in range(num_epochs):
    optimizer.zero_grad()

    # Model prediction
    outputs = model(x_batch)

    # Calculate loss
    loss = criterion(outputs, y_batch)

    # Backpropagation and weight update
    loss.backward()
    optimizer.step()

    if (epoch + 1) % 20 == 0:
        print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')
```

### 3.5. Evaluating the Model

After training, we can predict the next character for a given input sequence:

```python
def predict_next_char(model, input_seq):
    model.eval()  # switch to evaluation mode
    with torch.no_grad():
        indices = torch.tensor([[char_to_index[ch] for ch in input_seq]])
        # One-hot encode: (1, seq_length) -> (1, seq_length, input_size)
        input_tensor = F.one_hot(indices, num_classes=input_size).float()
        output = model(input_tensor)
        _, predicted_index = torch.max(output, 1)
    return index_to_char[predicted_index.item()]

# Prediction test (the input must be exactly seq_length characters long)
test_seq = "hello"
predicted_char = predict_next_char(model, test_seq)
print(f'Input sequence: {test_seq} Predicted next character: {predicted_char}')
```
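Character indices are categorical values, so they are typically one-hot encoded before being fed to an RNN like this one. A quick illustration with `torch.nn.functional.one_hot` (the vocabulary size of 4 is an assumption for the demo):

```python
import torch
import torch.nn.functional as F

# One sequence of 3 character indices, over a vocabulary of 4 characters
indices = torch.tensor([[0, 2, 1]])
onehot = F.one_hot(indices, num_classes=4).float()
print(onehot.shape)  # torch.Size([1, 3, 4])
print(onehot[0, 0])  # tensor([1., 0., 0., 0.])
```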
## 4. Conclusion

In this article, we explored the concept of the bidirectional RNN and implemented one in PyTorch. Because it can draw on information from both the past and the future, the bidirectional RNN is a powerful structure for sequence-processing tasks such as natural language processing (NLP), and it can learn the patterns and dependencies of sequence data more effectively than a unidirectional model.

We will continue to explore further deep learning techniques and architectures; I hope this article is of real help in your deep learning studies!