{"id":36455,"date":"2024-11-01T09:48:37","date_gmt":"2024-11-01T09:48:37","guid":{"rendered":"http:\/\/atmokpo.com\/w\/?p=36455"},"modified":"2024-11-01T11:53:07","modified_gmt":"2024-11-01T11:53:07","slug":"deep-learning-pytorch-course-lstm-cell-implementation","status":"publish","type":"post","link":"https:\/\/atmokpo.com\/w\/36455\/","title":{"rendered":"Deep Learning PyTorch Course, LSTM Cell Implementation"},"content":{"rendered":"<p><body><\/p>\n<p>\n        Deep learning has received a lot of attention in recent years, and in particular, recurrent neural networks (RNNs) are very useful for processing sequences of data such as time series data or natural language processing (NLP).<br \/>\n        A type of RNN called Long Short-Term Memory (LSTM) networks is designed to address the long-term dependency problem of RNNs.<br \/>\n        LSTM cells have a structure that allows them to efficiently store and process information using internal states, input gates, forget gates, and output gates.<br \/>\n        In this lecture, we will explain how to implement an LSTM cell using PyTorch.\n    <\/p>\n<h2>1. Basic Concepts of LSTM<\/h2>\n<p>\n        To understand the structure of LSTM, let&#8217;s first look at the basic concepts of RNN. Traditional RNNs calculate the next hidden state based on the current input and the previous hidden state.<br \/>\n        However, this structure makes effective learning difficult due to the gradient vanishing problem with long sequence data.<br \/>\n        The LSTM cell solves this problem by passing information through multiple gates, enabling it to learn long-term patterns effectively.\n    <\/p>\n<h2>2. 
Structure of LSTM Cell<\/h2>\n<p>\n        LSTM has the following key components:<\/p>\n<ul>\n<li><strong>Cell State:<\/strong> Stores the long-term memory of the network, allowing past information to be preserved.<\/li>\n<li><strong>Input Gate:<\/strong> Determines how much of the current input is written to the cell state.<\/li>\n<li><strong>Forget Gate:<\/strong> Decides how much of the previous cell state to discard.<\/li>\n<li><strong>Output Gate:<\/strong> Determines the output (hidden state) based on the current cell state.<\/li>\n<\/ul>\n<p>        Through these gates, LSTM can drop irrelevant information while retaining important information, enabling efficient learning of patterns in time series data.\n    <\/p>\n<h2>3. Implementing LSTM Cell (PyTorch)<\/h2>\n<p>\n        We will implement the LSTM cell from scratch using basic PyTorch building blocks, and then demonstrate it with a simple example.\n    <\/p>\n<h3>3.1 Implementing LSTM Cell<\/h3>\n<p>\n        The code below is an example of an LSTM cell implemented in PyTorch. 
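For reference, the gate computations described in section 2 can be summarized by the standard LSTM update equations (notation added here for clarity: $\sigma$ is the logistic sigmoid, $\odot$ is elementwise multiplication, and $[h_{t-1}, x_t]$ denotes concatenation, matching the `combined` tensor in the code):

```latex
\begin{aligned}
f_t &= \sigma\!\left(W_f [h_{t-1}, x_t] + b_f\right) \\
i_t &= \sigma\!\left(W_i [h_{t-1}, x_t] + b_i\right) \\
\tilde{c}_t &= \tanh\!\left(W_c [h_{t-1}, x_t] + b_c\right) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
o_t &= \sigma\!\left(W_o [h_{t-1}, x_t] + b_o\right) \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

The lines correspond, in order, to the forget gate, input gate, candidate cell state, cell-state update, output gate, and hidden-state update computed by the cell below.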
This code implements the internal state and the gates of the LSTM.\n    <\/p>\n<pre><code>\nimport torch\nimport torch.nn as nn\n\nclass LSTMCell(nn.Module):\n    def __init__(self, input_size, hidden_size):\n        super(LSTMCell, self).__init__()\n        self.input_size = input_size\n        self.hidden_size = hidden_size\n        \n        # Gate weights: each gate is a linear layer over the concatenated [x, h_prev]\n        self.Wf = nn.Linear(input_size + hidden_size, hidden_size)  # Forget gate\n        self.Wi = nn.Linear(input_size + hidden_size, hidden_size)  # Input gate\n        self.Wc = nn.Linear(input_size + hidden_size, hidden_size)  # Candidate cell state\n        self.Wo = nn.Linear(input_size + hidden_size, hidden_size)  # Output gate\n        \n    def forward(self, x, hidden):\n        h_prev, c_prev = hidden\n        \n        # Concatenate input with previous hidden state\n        combined = torch.cat((x, h_prev), 1)\n        \n        # Forget gate\n        f_t = torch.sigmoid(self.Wf(combined))\n        # Input gate\n        i_t = torch.sigmoid(self.Wi(combined))\n        # Candidate cell state\n        c_hat_t = torch.tanh(self.Wc(combined))\n        # Current cell state\n        c_t = f_t * c_prev + i_t * c_hat_t\n        # Output gate\n        o_t = torch.sigmoid(self.Wo(combined))\n        # Current hidden state\n        h_t = o_t * torch.tanh(c_t)\n        \n        return h_t, c_t\n    <\/code><\/pre>\n<h3>3.2 Testing LSTM Cell<\/h3>\n<p>\n        Now we will write a simple example to test the LSTM cell. 
This example shows the process of applying the LSTM cell step by step to a randomly generated input sequence.\n    <\/p>\n<pre><code>\n# Input parameters\ninput_size = 4\nhidden_size = 3\nsequence_length = 5\n\n# Initialize the LSTM cell\nlstm_cell = LSTMCell(input_size, hidden_size)\n\n# Initialize the hidden state and cell state\nh_t = torch.zeros(1, hidden_size)\nc_t = torch.zeros(1, hidden_size)\n\n# Random input sequence of shape (sequence_length, batch_size, input_size)\ninput_sequence = torch.randn(sequence_length, 1, input_size)\n\nfor x in input_sequence:\n    h_t, c_t = lstm_cell(x, (h_t, c_t))\n    print(f'Current hidden state: {h_t}')\n    print(f'Current cell state: {c_t}')\n    print('---')\n    <\/code><\/pre>\n<h3>3.3 Building an LSTM Model<\/h3>\n<p>\n        Beyond the single cell, let&#8217;s build an LSTM model that processes batched sequence data.<br \/>\n        The model takes a batch of sequences of shape (batch_size, sequence_length, input_size) as input and outputs one prediction per sequence.\n    <\/p>\n<pre><code>\nclass LSTMModel(nn.Module):\n    def __init__(self, input_size, hidden_size, output_size):\n        super(LSTMModel, self).__init__()\n        self.lstm_cell = LSTMCell(input_size, hidden_size)\n        self.fc = nn.Linear(hidden_size, output_size)\n        \n    def forward(self, x):\n        # x has shape (batch_size, sequence_length, input_size)\n        batch_size, seq_len, _ = x.size()\n        h_t = torch.zeros(batch_size, self.lstm_cell.hidden_size, device=x.device)\n        c_t = torch.zeros(batch_size, self.lstm_cell.hidden_size, device=x.device)\n        \n        # Unroll the cell over the time dimension\n        for t in range(seq_len):\n            h_t, c_t = self.lstm_cell(x[:, t, :], (h_t, c_t))\n        \n        return self.fc(h_t)  # Predict from the last hidden state\n    <\/code><\/pre>\n<h2>4. Training the LSTM Model<\/h2>\n<p>\n        Now we will explain how to train the model. 
The general training process is as follows:<\/p>\n<ol>\n<li>Data preparation: prepare the input sequences and their corresponding labels.<\/li>\n<li>Model initialization: create an instance of the LSTM model.<\/li>\n<li>Loss function and optimizer: choose the loss function and the optimization algorithm.<\/li>\n<li>Training loop: repeatedly run the forward pass, compute the loss, and update the parameters.<\/li>\n<\/ol>\n<p>        The code below implements this process.\n    <\/p>\n<pre><code>\n# Define the model parameters\ninput_size = 4\nhidden_size = 3\noutput_size = 1\nnum_epochs = 100\nlearning_rate = 0.01\n\n# Initialize the LSTM model\nmodel = LSTMModel(input_size, hidden_size, output_size)\n\n# Define loss function and optimizer\ncriterion = nn.MSELoss()\noptimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)\n\n# Dummy dataset (random inputs and targets for demonstration)\nX = torch.randn(100, 5, 4)  # 100 sequences of length 5, each with 4 features\ny = torch.randn(100, 1)      # 100 target values\n\n# Training loop\nfor epoch in range(num_epochs):\n    model.train()\n    \n    optimizer.zero_grad()  # Reset gradients\n    outputs = model(X)     # Forward pass\n    loss = criterion(outputs, y)  # Compute loss\n    loss.backward()        # Backward pass\n    optimizer.step()       # Update parameters\n    \n    if (epoch+1) % 10 == 0:\n        print(f'Epoch [{epoch+1}\/{num_epochs}], Loss: {loss.item():.4f}')\n    <\/code><\/pre>\n<h2>5. Conclusion<\/h2>\n<p>\n        In this lecture, we implemented an LSTM cell and model in PyTorch and walked through the entire flow, from the cell computations to the training loop.<br \/>\n        LSTM is very useful for processing time series data and can be applied in fields such as natural language processing, stock price prediction, and speech recognition.<br \/>\n        Understanding the concepts behind deep learning, RNNs, and LSTMs will make it easier to work with more complex models. 
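As a closing sanity check (an addition to this lecture, not part of the original code), note that PyTorch ships a built-in cell, `torch.nn.LSTMCell`, whose step interface the hand-written cell above mirrors; a minimal sketch of the built-in API, using the same sizes as this lecture:

```python
import torch
import torch.nn as nn

# Built-in reference cell with the same sizes used in this lecture.
input_size, hidden_size, batch_size = 4, 3, 1
cell = nn.LSTMCell(input_size, hidden_size)

x = torch.randn(batch_size, input_size)
h0 = torch.zeros(batch_size, hidden_size)
c0 = torch.zeros(batch_size, hidden_size)

# One step returns the next hidden and cell states,
# just like the custom LSTMCell.forward above.
h1, c1 = cell(x, (h0, c0))
print(h1.shape, c1.shape)
```

Swapping the custom cell for `nn.LSTMCell` is a quick way to confirm that shapes and behavior agree.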
The next steps could involve learning about GRUs and deeper neural network architectures.\n    <\/p>\n<h2>6. Additional Learning Materials<\/h2>\n<p>\n        &#8211; <a href=\"https:\/\/pytorch.org\/docs\/stable\/generated\/torch.nn.LSTM.html\" target=\"_blank\" rel=\"noopener\">PyTorch LSTM Documentation<\/a><br \/>\n        &#8211; <a href=\"https:\/\/colah.github.io\/posts\/2015-08-Understanding-LSTMs\/\" target=\"_blank\" rel=\"noopener\">Understanding LSTM Networks (Christopher Olah)<\/a><br \/>\n        &#8211; <a href=\"https:\/\/www.deeplearningbook.org\/\" target=\"_blank\" rel=\"noopener\">Deep Learning Book (Ian Goodfellow)<\/a>\n<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[149],"tags":[],"class_list":["post-36455","post","type-post","status-publish","format-standard","hentry","category-pytorch-study"]}