<h1>Deep Learning PyTorch Course, Types of Generative Models</h1>
<p>Deep learning has advanced remarkably in recent years, and among its successes, generative models stand out for their ability to create new data samples. In this article, we survey the main types of generative models, explain how each works, and provide example code in PyTorch.</p>
<h2>What is a Generative Model?</h2>
<p>A generative model is a machine learning model that learns the distribution of a dataset and can draw new samples from it: data that resembles the training data but does not appear in it. Generative models are used in image generation, text generation, and music generation, among other fields. The main types include:</p>
<h2>1. Autoencoders</h2>
<p>An autoencoder is a neural network that compresses its input into a lower-dimensional representation and then reconstructs the input from that representation. Because the compressed representation lives in a latent space, an autoencoder can also generate data by decoding points from that space.</p>
<h3>Structure of Autoencoders</h3>
<p>An autoencoder has two parts:</p>
<ul>
<li><b>Encoder:</b> maps the input data to a latent representation.</li>
<li><b>Decoder:</b> reconstructs the original data from the latent representation.</li>
</ul>
<h3>Creating an Autoencoder with PyTorch</h3>
<pre>
import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Data preprocessing.
# BCELoss requires targets in [0, 1], so we use ToTensor() alone
# and do not normalize the images to [-1, 1] here.
transform = transforms.ToTensor()

# Load the MNIST dataset
train_dataset = datasets.MNIST(root='./data', train=True, transform=transform, download=True)
train_loader = DataLoader(dataset=train_dataset, batch_size=64, shuffle=True)

# Define the autoencoder model
class Autoencoder(nn.Module):
    def __init__(self):
        super(Autoencoder, self).__init__()
        self.encoder = nn.Sequential(
            nn.Linear(784, 256),
            nn.ReLU(),
            nn.Linear(256, 64)
        )
        self.decoder = nn.Sequential(
            nn.Linear(64, 256),
            nn.ReLU(),
            nn.Linear(256, 784),
            nn.Sigmoid()
        )

    def forward(self, x):
        x = x.view(-1, 784)  # flatten 28*28 images to 784-d vectors
        encoded = self.encoder(x)
        decoded = self.decoder(encoded)
        return decoded

# Model, loss function, and optimizer
model = Autoencoder()
criterion = nn.BCELoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

# Training
num_epochs = 10
for epoch in range(num_epochs):
    for data in train_loader:
        img, _ = data
        optimizer.zero_grad()
        output = model(img)
        loss = criterion(output, img.view(-1, 784))
        loss.backward()
        optimizer.step()
    print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')
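
# After training, new images can be produced by decoding random latent
# vectors directly. (This step is an illustrative addition to the course
# code: a plain autoencoder places no prior on its latent space, so
# decoded random vectors are often blurry or unrealistic digits.)
model.eval()
with torch.no_grad():
    z = torch.randn(8, 64)        # 8 random 64-d latent vectors
    samples = model.decoder(z)    # shape (8, 784), values in (0, 1)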
</pre>
<p>This example trains an autoencoder on MNIST: the encoder compresses the 784 input pixels to a 64-dimensional latent vector, and the decoder restores it to 784 outputs.</p>
<h2>2. Generative Adversarial Networks (GANs)</h2>
<p>A GAN trains two neural networks, a generator and a discriminator, in competition. The generator produces fake data that resembles real data, and the discriminator judges whether a given sample is real or fake.</p>
<h3>How GANs Work</h3>
<p>Training proceeds as follows:</p>
<ol>
<li>The generator takes random noise as input and produces fake images.</li>
<li>The discriminator receives both real images and generated images and classifies each as real or fake.</li>
<li>As the discriminator gets better at spotting fakes, the generator is pushed to produce increasingly realistic images.</li>
</ol>
<h3>Creating a GAN Model with PyTorch</h3>
<pre>
class Generator(nn.Module):
    def __init__(self):
        super(Generator, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(100, 256),
            nn.ReLU(),
            nn.Linear(256, 512),
            nn.ReLU(),
            nn.Linear(512, 1024),
            nn.ReLU(),
            nn.Linear(1024, 784),
            nn.Tanh()  # outputs in [-1, 1]
        )

    def forward(self, x):
        return self.model(x)

class Discriminator(nn.Module):
    def __init__(self):
        super(Discriminator, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 512),
            nn.LeakyReLU(0.2),
            nn.Linear(512, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
            nn.Sigmoid()
        )

    def forward(self, x):
        return self.model(x)

# Create model instances
generator = Generator()
discriminator = Discriminator()

# Because the generator ends in Tanh, rescale the real MNIST images
# to [-1, 1] so real and generated samples share the same range.
gan_transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5,), (0.5,))
])
gan_dataset = datasets.MNIST(root='./data', train=True, transform=gan_transform, download=True)
gan_loader = DataLoader(dataset=gan_dataset, batch_size=64, shuffle=True)

# Loss function and optimizers
criterion = nn.BCELoss()
optimizer_g = optim.Adam(generator.parameters(), lr=0.0002)
optimizer_d = optim.Adam(discriminator.parameters(), lr=0.0002)

# Training process
num_epochs = 100
for epoch in range(num_epochs):
    for data in gan_loader:
        real_images, _ = data
        real_labels = torch.ones(real_images.size(0), 1)
        fake_labels = torch.zeros(real_images.size(0), 1)

        # Discriminator training
        optimizer_d.zero_grad()
        outputs = discriminator(real_images.view(-1, 784))
        d_loss_real = criterion(outputs, real_labels)
        d_loss_real.backward()

        noise = torch.randn(real_images.size(0), 100)
        fake_images = generator(noise)
        outputs = discriminator(fake_images.detach())  # detach: no generator gradients here
        d_loss_fake = criterion(outputs, fake_labels)
        d_loss_fake.backward()

        optimizer_d.step()

        # Generator training: try to make the discriminator output "real"
        optimizer_g.zero_grad()
        outputs = discriminator(fake_images)
        g_loss = criterion(outputs, real_labels)
        g_loss.backward()
        optimizer_g.step()

    print(f'Epoch [{epoch+1}/{num_epochs}], d_loss: {d_loss_real.item() + d_loss_fake.item():.4f}, g_loss: {g_loss.item():.4f}')
</pre>
<p>This is a basic GAN: the generator maps 100-dimensional random noise to a 784-dimensional image, and the discriminator judges whether an image is real or generated.</p>
<h2>3. Variational Autoencoders (VAEs)</h2>
<p>A VAE extends the autoencoder into a proper generative model. Instead of mapping each input to a single point, the encoder produces a distribution over the latent space; by learning this latent distribution, a VAE can sample diverse new data points.</p>
<h3>Structure of VAEs</h3>
<p>A VAE uses variational inference to map input data to a latent space. 
VAEs consist of an encoder and a decoder: the encoder maps the input to a mean and a log-variance, a latent vector is sampled from the corresponding Gaussian, and the decoder generates a data point from that sample.</p>
<h3>Creating a VAE Model with PyTorch</h3>
<pre>
class VAE(nn.Module):
    def __init__(self):
        super(VAE, self).__init__()
        self.encoder = nn.Sequential(
            nn.Linear(784, 256),
            nn.ReLU(),
            nn.Linear(256, 128),
            nn.ReLU()
        )
        self.fc_mean = nn.Linear(128, 20)
        self.fc_logvar = nn.Linear(128, 20)
        self.decoder = nn.Sequential(
            nn.Linear(20, 128),
            nn.ReLU(),
            nn.Linear(128, 256),
            nn.ReLU(),
            nn.Linear(256, 784),
            nn.Sigmoid()
        )

    def encode(self, x):
        h = self.encoder(x.view(-1, 784))
        return self.fc_mean(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps with eps ~ N(0, I), keeping the
        # sampling step differentiable with respect to mu and logvar.
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mu + eps * std

    def decode(self, z):
        return self.decoder(z)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

# Loss: reconstruction term plus KL divergence to the N(0, I) prior.
# binary_cross_entropy requires inputs and targets in [0, 1], so the
# images must come from ToTensor() without further normalization.
def loss_function(recon_x, x, mu, logvar):
    BCE = nn.functional.binary_cross_entropy(recon_x, x.view(-1, 784), reduction='sum')
    KLD = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return BCE + KLD

# Initialize the model and optimizer
model = VAE()
optimizer = optim.Adam(model.parameters(), lr=0.001)

# Training process
num_epochs = 10
for epoch in range(num_epochs):
    for data in train_loader:
        img, _ = data
        optimizer.zero_grad()
        recon_batch, mu, logvar = model(img)
        loss = loss_function(recon_batch, img, mu, logvar)
        loss.backward()
        optimizer.step()
    print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')
</pre>
<h2>4. Research Trends and Conclusion</h2>
<p>Generative models make it possible to synthesize realistic data, which makes them applicable across many fields: GANs, VAEs, and autoencoders are widely used for image, video, and text generation, and they greatly expand what deep learning can do in data science and artificial intelligence.</p>
<p>As deep learning continues to evolve, so do generative models. The concepts and examples covered here are a starting point for further experiments and research.</p>
<p>To go deeper into the applications of generative models, consult the original papers and more advanced materials for additional case studies.</p>
<p>We hope this post helps you understand generative models and appreciate the appeal of deep learning.</p>
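<p>As a final sketch, here is how the trained VAE from section 3 generates new digits: sample latent vectors from the standard normal prior and decode them. This is an illustrative addition, assuming the trained <code>model</code> object defined above.</p>
<pre>
model.eval()
with torch.no_grad():
    z = torch.randn(16, 20)               # 16 samples from the N(0, I) prior; latent dim is 20
    samples = model.decode(z)             # shape (16, 784), pixel values in (0, 1)
    images = samples.view(16, 1, 28, 28)  # reshape into image tensors for display
</pre>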