{"id":1543,"date":"2023-03-18T22:13:00","date_gmt":"2023-03-18T20:13:00","guid":{"rendered":"https:\/\/www.ayhanarda.com\/blog\/?p=1543"},"modified":"2023-03-18T22:14:04","modified_gmt":"2023-03-18T20:14:04","slug":"sqoop-nedir","status":"publish","type":"post","link":"https:\/\/www.ayhanarda.com\/blog\/2023\/03\/sqoop-nedir\/","title":{"rendered":"Sqoop Nedir?"},"content":{"rendered":"\n<h4 class=\"wp-block-heading\"><strong>Sqoop Nedir?<\/strong><\/h4>\n\n\n\n<p>Apache Sqoop, ili\u015fkisel veri tabanlar\u0131 ile hadoop aras\u0131nda b\u00fcy\u00fck verilerin aktar\u0131m\u0131n\u0131 sa\u011flayan java tabanl\u0131 bir yaz\u0131l\u0131md\u0131r. Her iki y\u00f6nde aktar\u0131m yapabilmektedir. Veriyi ili\u015fkisel veri taban\u0131ndan okuyup (Oracle, MySQL, SQL Server, Postgres, Teradata vs) hadoop da\u011f\u0131t\u0131k dosya sistemine (HDFS, Hive, Hbase \u2026) aktarabildi\u011fi gibi hadoop ortam\u0131ndan okuyup ili\u015fkisel veri tabanlar\u0131na da yaz\u0131labilmesini sa\u011flar.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><a href=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-1.png\"><img loading=\"lazy\" decoding=\"async\" width=\"972\" height=\"282\" src=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-1.png\" alt=\"\" class=\"wp-image-1547\" srcset=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-1.png 972w, https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-1-300x87.png 300w, https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-1-768x223.png 768w\" sizes=\"auto, (max-width: 972px) 100vw, 972px\" \/><\/a><\/figure>\n\n\n\n<p>Sqoop, &#8220;SQL&#8217;den Hadoop&#8217;a ve Hadoop&#8217;tan SQL&#8217;e&#8221; anlam\u0131na gelir ve Cloudera taraf\u0131ndan geli\u015ftirilmi\u015ftir.<\/p>\n\n\n\n<p>Sqoop\u2019un avantaj\u0131 ise veri aktar\u0131m i\u015flemlerini MapReduce g\u00f6revleri ile paralel olarak yaparak 
aktar\u0131m\u0131 \u00e7ok daha h\u0131zl\u0131 tamamlamak. Ayr\u0131ca MySQL ve PostgreSQL i\u00e7in JDBC kullanmadan (mysqldump gibi) daha d\u00fc\u015f\u00fck seviyeli ve performansl\u0131 veri aktar\u0131m\u0131 da yap\u0131labiliyor.<\/p>\n\n\n\n<p>\u0130mport i\u015flemi s\u0131ras\u0131nda Sqoop basit\u00e7e meta verilerden faydalanarak tablonun birincil anahtar\u0131n\u0131 bulup minimum ve maksimum de\u011ferlerini alarak e\u015fit olarak Map say\u0131s\u0131na uygun olarak b\u00f6lerek farkl\u0131 d\u00fc\u011f\u00fcmler \u00fczerinde bu verileri paralel olarak aktar\u0131r. Bu y\u00fczden sonu\u00e7 klas\u00f6r i\u00e7inde birden fazla dosyaya yaz\u0131l\u0131r. Aktar\u0131m s\u0131ras\u0131nda veritaban\u0131na yeni kay\u0131t geliyorsa bunlar aktar\u0131lmayabilir, veri tutarl\u0131l\u0131\u011f\u0131na dikkat etmek gerekir.<\/p>\n\n\n\n<p>Export i\u015flemi s\u0131ras\u0131nda ise aksi belirtilmedi\u011fi s\u00fcrece verileri binerli gruplar halinde veritaban\u0131na INSERT ediyor. Bu i\u015flem de paralel olarak yap\u0131ld\u0131\u011f\u0131 i\u00e7in aktar\u0131m s\u0131ras\u0131nda veritaban\u0131nda y\u00fck olu\u015fturabilir. Her bir grubun yaz\u0131lmas\u0131 kendi ba\u015f\u0131na bir transaction oldu\u011fu i\u00e7in burada da veri tutarl\u0131l\u0131\u011f\u0131na dikkat etmek gerekir. E\u011fer kay\u0131tlar\u0131n tamam\u0131 aktar\u0131ld\u0131ktan sonra aktif olmas\u0131 isteniyorsa ara tablo kullan\u0131m\u0131n\u0131 sa\u011flayan \u2013staging-table parametresi kullan\u0131labilir. 
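<\/p>\n\n\n\n<p>As a sketch, an export through a staging table might look like the following; the connection string, table names, and HDFS path are illustrative, and the staging table is assumed to already exist with the same schema as the target table:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>sqoop export --connect jdbc:mysql:\/\/MYSQLDBADRESINIZ:3306\/personel --username ayhanarda -P --table calisanlar --staging-table calisanlar_staging --clear-staging-table --export-dir \/user\/ayhanarda\/calisanlar<\/code><\/pre>\n\n\n\n<p>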
Creating and cleaning that staging table is not done automatically; it has to be done by hand.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><a href=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image.png\"><img loading=\"lazy\" decoding=\"async\" width=\"600\" height=\"280\" src=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image.png\" alt=\"\" class=\"wp-image-1544\" srcset=\"https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image.png 600w, https:\/\/www.ayhanarda.com\/blog\/wp-content\/uploads\/2023\/03\/image-300x140.png 300w\" sizes=\"auto, (max-width: 600px) 100vw, 600px\" \/><\/a><figcaption class=\"wp-element-caption\">Sqoop<\/figcaption><\/figure>\n\n\n\n<p>For a Hadoop developer the real work starts once the data has been loaded into HDFS: the data is processed to surface the various insights hidden in it.<\/p>\n\n\n\n<p>That analysis therefore requires the data sitting in relational database management systems to be brought into HDFS first. Writing MapReduce code just to move data between a relational database and HDFS is tedious and uninteresting. This is where Apache Sqoop comes to the rescue and eases the pain: it automates the process of importing and exporting the data.<\/p>\n\n\n\n<p>Sqoop makes developers&#8217; lives easier by providing a CLI for importing and exporting data. They only have to supply basic information such as database credentials, the source, the destination, and the operation. 
Sqoop takes care of the rest.<\/p>\n\n\n\n<p>Internally, Sqoop turns the command into MapReduce tasks, which are then executed over HDFS. It uses the YARN framework for the imports and exports, which provides fault tolerance on top of parallelism.<\/p>\n\n\n\n<p><strong>Sqoop Features<\/strong><\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Robustness: Apache Sqoop is quite robust by nature. It has community support and contributions and is easy to use.<\/li>\n\n\n\n<li>Full load: with a single Sqoop command we can load an entire table. Sqoop also lets us load all tables of a database with a single command.<\/li>\n\n\n\n<li>Incremental load: Sqoop supports incremental loading. Using Sqoop, we can load just the parts of a table that have been updated.<\/li>\n\n\n\n<li>Parallel import\/export: Apache Sqoop uses the YARN framework to import and export data, which provides fault tolerance on top of parallelism.<\/li>\n\n\n\n<li>Importing the results of a SQL query: Sqoop also lets us import the result set returned by a SQL query into the Hadoop Distributed File System.<\/li>\n\n\n\n<li>Compression: we can compress our data using the deflate (gzip) algorithm via the --compress argument, or by specifying the --compression-codec argument. 
We can also load a compressed table into Apache Hive.<\/li>\n\n\n\n<li>Connectors for all major RDBMS databases: Sqoop provides connectors for a wide range of RDBMS databases, covering almost the entire ecosystem.<\/li>\n\n\n\n<li>Kerberos security integration: Kerberos is a computer network authentication protocol that works on the basis of tickets, allowing nodes communicating over an insecure network to prove their identities to one another. Apache Sqoop supports Kerberos authentication.<\/li>\n\n\n\n<li>Loading data directly into Hive\/HBase: using Sqoop we can load data straight into Hive for analysis, and we can also dump our data into HBase, a NoSQL database.<\/li>\n\n\n\n<li>Accumulo support: we can instruct Apache Sqoop to import a table into Accumulo rather than into a directory in HDFS.<\/li>\n<\/ol>\n\n\n\n<p><strong>Sqoop&#8217;s Limitations<\/strong><\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Apache Sqoop cannot be paused or resumed; 
an import or export is an atomic step.<\/li>\n\n\n\n<li>The performance of a Sqoop export depends on the hardware and configuration of the RDBMS server.<\/li>\n\n\n\n<li>Sqoop is built on the MapReduce paradigm.<\/li>\n\n\n\n<li>Failures during an import or export require special handling.<\/li>\n\n\n\n<li>Not every RDBMS connector is equal; for some databases Sqoop offers faster bulk operations than for others.<\/li>\n<\/ol>\n\n\n\n<p><strong>Sqoop Arguments<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-table\"><table><thead><tr><th>Argument<\/th><th>Description<\/th><\/tr><\/thead><tbody><tr><td><code>--append<\/code><\/td><td>Appends the imported data to an existing dataset in HDFS<\/td><\/tr><tr><td><code>--as-avrodatafile<\/code><\/td><td>Imports the data as Avro data files<\/td><\/tr><tr><td><code>--as-sequencefile<\/code><\/td><td>Imports the data as SequenceFiles<\/td><\/tr><tr><td><code>--as-textfile<\/code><\/td><td>Imports the data as plain text. This is the default format.<\/td><\/tr><tr><td><code>--as-parquetfile<\/code><\/td><td>Imports the data as Parquet files<\/td><\/tr><tr><td><code>--boundary-query &lt;statement&gt;<\/code><\/td><td>A user-supplied query whose result sets the boundary values used to create splits.<\/td><\/tr><tr><td><code>--columns &lt;col,col,col\u2026&gt;<\/code><\/td><td>Specifies which columns of the table to select.<\/td><\/tr><tr><td><code>--delete-target-dir<\/code><\/td><td>Deletes the target directory if it already exists.<\/td><\/tr><tr><td><code>--direct<\/code><\/td><td>Uses the database&#8217;s direct (non-JDBC) connector when one is available.<\/td><\/tr><tr><td><code>--fetch-size &lt;n&gt;<\/code><\/td><td>Limits the number of rows fetched from the database at a time. 
For example, fetch 50 rows per read.<\/td><\/tr><tr><td><code>--inline-lob-limit &lt;n&gt;<\/code><\/td><td>The maximum size of a LOB to store inline in a row.<\/td><\/tr><tr><td><code>-m,--num-mappers &lt;n&gt;<\/code><\/td><td>The number of parallel map tasks to run.<\/td><\/tr><tr><td><code>-e,--query &lt;statement&gt;<\/code><\/td><td>Import the results of <em><code>statement<\/code><\/em>.<\/td><\/tr><tr><td><code>--split-by &lt;column-name&gt;<\/code><\/td><td>Specifies the column the table is split on. Cannot be used together with <code>--autoreset-to-one-mapper<\/code>.<\/td><\/tr><tr><td><code>--autoreset-to-one-mapper<\/code><\/td><td>Falls back to a single mapper when the table has no primary key and no split column has been specified. Cannot be used together with the <code>--split-by &lt;col&gt;<\/code> option.<\/td><\/tr><tr><td><code>--table &lt;table-name&gt;<\/code><\/td><td>The name of the source table to read<\/td><\/tr><tr><td><code>--target-dir &lt;dir&gt;<\/code><\/td><td>The target HDFS directory<\/td><\/tr><tr><td><code>--warehouse-dir &lt;dir&gt;<\/code><\/td><td>The HDFS parent directory for the table destination<\/td><\/tr><tr><td><code>--where &lt;where clause&gt;<\/code><\/td><td>The WHERE clause to apply during the import.<\/td><\/tr><tr><td><code>-z,--compress<\/code><\/td><td>Enables compression<\/td><\/tr><tr><td><code>--compression-codec &lt;c&gt;<\/code><\/td><td>The Hadoop codec to use (gzip by default)<\/td><\/tr><tr><td><code>--null-string &lt;null-string&gt;<\/code><\/td><td>The string to write in place of null for string columns. 
For example, if we want &#8220;Empty&#8221; to be written when a null arrives.<\/td><\/tr><tr><td><code>--null-non-string &lt;null-string&gt;<\/code><\/td><td>The value to write for nulls when the column is not a string<\/td><\/tr><\/tbody><\/table><figcaption class=\"wp-element-caption\">Sqoop Arguments<\/figcaption><\/figure>\n\n\n\n<p><strong>Examples<\/strong><\/p>\n\n\n\n<p><strong>Let&#8217;s list the tables in the personel database on the MySQL server.<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>sqoop list-tables --connect jdbc:mysql:\/\/MYSQLDBADRESINIZ:3306\/personel --username ayhanarda -P\nEnter password:\n23\/03\/15 19:59:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.\ndepartmanlar\ndepartman_calisanlari\ndepartman_yoneticileri\ncalisanlar\nunvanlar\nmaaslar<\/code><\/pre>\n\n\n\n<p><strong>Let&#8217;s import the calisanlar table into HDFS.<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>sqoop import --connect jdbc:mysql:\/\/MYSQLDBADRESINIZ:3306\/personel --username ayhanarda -P --table calisanlar<\/code><\/pre>\n\n\n\n<p><strong>Let&#8217;s pull the personel database directly into Hive.<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>sqoop import-all-tables --connect jdbc:mysql:\/\/MYSQLDBADRESINIZ:3306\/personel --username ayhanarda -P --direct --hive-import<\/code><\/pre>\n\n\n\n<p>The <code>--direct<\/code> parameter here makes Sqoop use mysqldump instead of the JDBC connector, which changes the transfer time.<\/p>\n\n\n\n<p>We do not have to transfer an entire table with Sqoop: the <code>--query<\/code> parameter lets us transfer only the result of a query we supply. Beyond that, we can set the field and record delimiters of the text files that data is written to by default, and besides text files, data can also be transferred in Hadoop&#8217;s binary SequenceFile format or in Avro format.<\/p>\n\n\n\n<p>Click to access the <a href=\"https:\/\/sqoop.apache.org\/docs\/1.4.6\/SqoopUserGuide.html\" target=\"_blank\" rel=\"noopener\" title=\"\">Apache Sqoop User Guide<\/a>.<br>Click to access the <a href=\"https:\/\/docs.cloudera.com\/sqoop\/1.4.7.7.1.6.0\/user-guide\/index.html\" target=\"_blank\" rel=\"noopener\" title=\"\">Cloudera Sqoop Guide<\/a>.<\/p>\n\n\n\n<p>Ayhan ARDA<\/p>","protected":false},"excerpt":{"rendered":"What is Sqoop? Apache Sqoop is a Java-based tool for transferring bulk data between relational databases and Hadoop. 
It works in both directions: it can read data from a relational database (Oracle, MySQL, SQL Server, Postgres, Teradata, etc.) and move it into HDFS and the stores built on it (Hive, HBase, \u2026), and it can read from the Hadoop side and write back into relational databases. The name Sqoop [&hellip;]","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_coblocks_attr":"","_coblocks_dimensions":"","_coblocks_responsive_height":"","_coblocks_accordion_ie_support":"","footnotes":""},"categories":[713,1517],"tags":[1519,1482,1520,1518,1522,1521,1523],"class_list":["post-1543","post","type-post","status-publish","format-standard","hentry","category-bigdata","category-sqoop","tag-apache-sqoop","tag-cloudera","tag-cloudera-sqoop","tag-sqoop","tag-sqoop-export","tag-sqoop-import","tag-sqoop-nedir"],"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/posts\/1543","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/comments?post=1543"}],"version-history":[{"count":8,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/posts\/1543\/revisions"}],"predecessor-version":[{"id":1554,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/posts\/1543\/revisions\/1554"}],"wp:attachment":[{"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/media?parent=1543"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/categories?post=1543"},{"taxonomy":"post_tag","embe
ddable":true,"href":"https:\/\/www.ayhanarda.com\/blog\/wp-json\/wp\/v2\/tags?post=1543"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}