[{"data":1,"prerenderedAt":1185},["ShallowReactive",2],{"article-alternates":3,"article-\u002Fes\u002Fai\u002Fmedicion-de-citas-llm":13},{"i18nKey":4,"paths":5},"ai-002-2026-05",{"de":6,"en":7,"es":8,"fr":9,"it":10,"ru":11,"tr":12},"\u002Fde\u002Fai\u002Fllm-zitierungsmetriken-seo","\u002Fen\u002Fai\u002Fllm-citation-measurement-new-seo-metrics","\u002Fes\u002Fai\u002Fmedicion-de-citas-llm","\u002Ffr\u002Fai\u002Fllm-citation-oelcuemue-yeni-seo-metrik-setiniz","\u002Fit\u002Fai\u002Fmisurazione-citazione-llm","\u002Fru\u002Fai\u002Fllm-citation-measurement-new-seo-metric","\u002Ftr\u002Fai\u002Fllm-citation-olcumu-yeni-seo-metrik-setiniz",{"_path":8,"_dir":14,"_draft":15,"_partial":15,"_locale":16,"title":17,"description":18,"publishedAt":19,"modifiedAt":19,"category":14,"i18nKey":4,"tags":20,"readingTime":26,"author":27,"body":28,"_type":920,"_id":1180,"_source":1181,"_file":1182,"_stem":1183,"_extension":1184},"ai",false,"","Medición de Citas en LLM — Tu Nuevo Conjunto de Métricas SEO","Metodología production-ready para medir la tasa de citación de tu marca en Perplexity, ChatGPT y Gemini. Mientras el tráfico orgánico cae, la tasa de citación se convierte en tu nueva métrica de visibilidad.","2026-05-09",[21,22,23,24,25],"llm-citation","geo","seo-metrics","generative-ai","attribution",8,"Roibase",{"type":29,"children":30,"toc":1171},"root",[31,39,46,51,65,70,106,112,117,127,167,172,182,443,454,464,675,680,690,795,800,806,811,819,862,872,890,895,901,906,916,973,978,988,998,1008,1024,1030,1035,1043,1084,1098,1104,1109,1119,1129,1139,1144,1149,1155,1160,1165],{"type":32,"tag":33,"props":34,"children":35},"element","p",{},[36],{"type":37,"value":38},"text","Tu tráfico de búsqueda cayó 40% pero Google Analytics no muestra declive orgánico. Es porque los usuarios ya no llegan a tu sitio — obtienen la respuesta de Perplexity y se van. La pregunta real: ¿aparece tu marca como fuente en esas respuestas? Mientras GA4 marca \"0 sesiones\", los LLM podrían haberte citado 47 veces. La tasa de citación es tu nueva métrica de visibilidad. Si no la mides, no existes.",{"type":32,"tag":40,"props":41,"children":43},"h2",{"id":42},"por-qué-la-citación-en-llm-es-crítica-ahora",[44],{"type":37,"value":45},"Por Qué la Citación en LLM es Crítica Ahora",{"type":32,"tag":33,"props":47,"children":48},{},[49],{"type":37,"value":50},"En 2024, los LLM interceptaron el 23% del tráfico de búsqueda (datos de Similarweb, febrero 2025). Un usuario pregunta \"mejor CRM para startups\", ChatGPT genera un resumen, cita 3 fuentes, el usuario cierra la pestaña. Las métricas SEO tradicionales (CTR, impresiones, sesiones) no capturan esto porque la consulta nunca aparece en Google Search Console — pasó por la API de OpenAI.",{"type":32,"tag":33,"props":52,"children":53},{},[54,56,63],{"type":37,"value":55},"Tasa de citación: la frecuencia con la que tu marca aparece como fuente en respuestas de LLM. La fórmula es simple: ",{"type":32,"tag":57,"props":58,"children":60},"code",{"className":59},[],[61],{"type":37,"value":62},"(número de respuestas donde tu marca es citada) \u002F (número total de respuestas relevantes)",{"type":37,"value":64},". Una tasa de 8% significa que en 100 preguntas relevantes, eres fuente en 8. El baseline de la industria oscila entre 2-5%. Por encima de 10%, tienes visibilidad orgánica fuera de consultas branded.",{"type":32,"tag":33,"props":66,"children":67},{},[68],{"type":37,"value":69},"Tres razones por las que debes implementar esta métrica ahora:",{"type":32,"tag":71,"props":72,"children":73},"ol",{},[74,86,96],{"type":32,"tag":75,"props":76,"children":77},"li",{},[78,84],{"type":32,"tag":79,"props":80,"children":81},"strong",{},[82],{"type":37,"value":83},"Dominio del zero-click:",{"type":37,"value":85}," El 91% de las respuestas de Perplexity no redirigen al usuario hacia sitios web (dato Q1 2025). La visibilidad de citación es tu único canal.",{"type":32,"tag":75,"props":87,"children":88},{},[89,94],{"type":32,"tag":79,"props":90,"children":91},{},[92],{"type":37,"value":93},"Transferencia de recall de marca:",{"type":37,"value":95}," Si un usuario ve tu marca citada 3 veces en una respuesta de LLM, la probabilidad de que te elija en la siguiente búsqueda branded aumenta 67% (investigación de BrightEdge, 2024).",{"type":32,"tag":75,"props":97,"children":98},{},[99,104],{"type":32,"tag":79,"props":100,"children":101},{},[102],{"type":37,"value":103},"Inteligencia competitiva:",{"type":37,"value":105}," Si tu competidor tiene una tasa de citación de 12% y la tuya es 3%, estás perdiendo la batalla de autoridad temática — no es algoritmo, es guerra semántica de índices.",{"type":32,"tag":40,"props":107,"children":109},{"id":108},"stack-production-para-rastrear-citaciones",[110],{"type":37,"value":111},"Stack Production para Rastrear Citaciones",{"type":32,"tag":33,"props":113,"children":114},{},[115],{"type":37,"value":116},"Medir citaciones en LLM requiere arquitectura de 4 capas: generación de consultas, muestreo de respuestas, extracción de citaciones, agregación. Un rastreador manual no es viable — necesitas ejecutar 200+ consultas diarias.",{"type":32,"tag":33,"props":118,"children":119},{},[120,125],{"type":32,"tag":79,"props":121,"children":122},{},[123],{"type":37,"value":124},"Capa 1: Generación de consultas",{"type":37,"value":126}," — ¿Qué preguntas probarás? Alimenta tu lista inicial con dos fuentes:",{"type":32,"tag":128,"props":129,"children":130},"ul",{},[131,157],{"type":32,"tag":75,"props":132,"children":133},{},[134,139,141,147,149,155],{"type":32,"tag":79,"props":135,"children":136},{},[137],{"type":37,"value":138},"Consultas históricas de GSC:",{"type":37,"value":140}," Exporta las consultas con impresiones > 100 en los últimos 90 días. Conviértelas al formato de prompt con ",{"type":32,"tag":57,"props":142,"children":144},{"className":143},[],[145],{"type":37,"value":146},"CONCAT(\"how \", query)",{"type":37,"value":148}," o ",{"type":32,"tag":57,"props":150,"children":152},{"className":151},[],[153],{"type":37,"value":154},"CONCAT(\"best \", query)",{"type":37,"value":156},". Ejemplo: \"CRM software\" → \"best CRM software for small teams\".",{"type":32,"tag":75,"props":158,"children":159},{},[160,165],{"type":32,"tag":79,"props":161,"children":162},{},[163],{"type":37,"value":164},"Brecha de palabras clave competitivas:",{"type":37,"value":166}," Extrae de Ahrefs\u002FSemrush las consultas donde tus competidores rankean pero tú no. Esto expone tu brecha semántica.",{"type":32,"tag":33,"props":168,"children":169},{},[170],{"type":37,"value":171},"Actualiza tu lista de consultas semanalmente. Conforme los LLM actualizan sus datos de entrenamiento, el patrón de citación cambia en diferentes consultas.",{"type":32,"tag":33,"props":173,"children":174},{},[175,180],{"type":32,"tag":79,"props":176,"children":177},{},[178],{"type":37,"value":179},"Capa 2: Muestreo de respuestas",{"type":37,"value":181}," — Ejecuta cada consulta en 3 LLM principales:",{"type":32,"tag":183,"props":184,"children":188},"pre",{"className":185,"code":186,"language":187,"meta":16,"style":16},"language-python shiki shiki-themes github-dark","engines = {\n    \"perplexity\": \"sonar-pro\",\n    \"chatgpt\": \"gpt-4o\",\n    \"gemini\": \"gemini-2.0-flash-thinking\"\n}\n\nfor query in query_list:\n    for engine, model in engines.items():\n        response = llm_client.complete(\n            model=model,\n            prompt=query,\n            temperature=0.3  # para output consistente\n        )\n        store_response(query, engine, response)\n","python",[189],{"type":32,"tag":57,"props":190,"children":191},{"__ignoreMap":16},[192,215,240,262,280,289,299,323,345,363,382,400,425,434],{"type":32,"tag":193,"props":194,"children":197},"span",{"class":195,"line":196},"line",1,[198,204,210],{"type":32,"tag":193,"props":199,"children":201},{"style":200},"--shiki-default:#E1E4E8",[202],{"type":37,"value":203},"engines ",{"type":32,"tag":193,"props":205,"children":207},{"style":206},"--shiki-default:#F97583",[208],{"type":37,"value":209},"=",{"type":32,"tag":193,"props":211,"children":212},{"style":200},[213],{"type":37,"value":214}," {\n",{"type":32,"tag":193,"props":216,"children":218},{"class":195,"line":217},2,[219,225,230,235],{"type":32,"tag":193,"props":220,"children":222},{"style":221},"--shiki-default:#9ECBFF",[223],{"type":37,"value":224},"    \"perplexity\"",{"type":32,"tag":193,"props":226,"children":227},{"style":200},[228],{"type":37,"value":229},": ",{"type":32,"tag":193,"props":231,"children":232},{"style":221},[233],{"type":37,"value":234},"\"sonar-pro\"",{"type":32,"tag":193,"props":236,"children":237},{"style":200},[238],{"type":37,"value":239},",\n",{"type":32,"tag":193,"props":241,"children":243},{"class":195,"line":242},3,[244,249,253,258],{"type":32,"tag":193,"props":245,"children":246},{"style":221},[247],{"type":37,"value":248},"    \"chatgpt\"",{"type":32,"tag":193,"props":250,"children":251},{"style":200},[252],{"type":37,"value":229},{"type":32,"tag":193,"props":254,"children":255},{"style":221},[256],{"type":37,"value":257},"\"gpt-4o\"",{"type":32,"tag":193,"props":259,"children":260},{"style":200},[261],{"type":37,"value":239},{"type":32,"tag":193,"props":263,"children":265},{"class":195,"line":264},4,[266,271,275],{"type":32,"tag":193,"props":267,"children":268},{"style":221},[269],{"type":37,"value":270},"    \"gemini\"",{"type":32,"tag":193,"props":272,"children":273},{"style":200},[274],{"type":37,"value":229},{"type":32,"tag":193,"props":276,"children":277},{"style":221},[278],{"type":37,"value":279},"\"gemini-2.0-flash-thinking\"\n",{"type":32,"tag":193,"props":281,"children":283},{"class":195,"line":282},5,[284],{"type":32,"tag":193,"props":285,"children":286},{"style":200},[287],{"type":37,"value":288},"}\n",{"type":32,"tag":193,"props":290,"children":292},{"class":195,"line":291},6,[293],{"type":32,"tag":193,"props":294,"children":296},{"emptyLinePlaceholder":295},true,[297],{"type":37,"value":298},"\n",{"type":32,"tag":193,"props":300,"children":302},{"class":195,"line":301},7,[303,308,313,318],{"type":32,"tag":193,"props":304,"children":305},{"style":206},[306],{"type":37,"value":307},"for",{"type":32,"tag":193,"props":309,"children":310},{"style":200},[311],{"type":37,"value":312}," query ",{"type":32,"tag":193,"props":314,"children":315},{"style":206},[316],{"type":37,"value":317},"in",{"type":32,"tag":193,"props":319,"children":320},{"style":200},[321],{"type":37,"value":322}," query_list:\n",{"type":32,"tag":193,"props":324,"children":325},{"class":195,"line":26},[326,331,336,340],{"type":32,"tag":193,"props":327,"children":328},{"style":206},[329],{"type":37,"value":330},"    for",{"type":32,"tag":193,"props":332,"children":333},{"style":200},[334],{"type":37,"value":335}," engine, model ",{"type":32,"tag":193,"props":337,"children":338},{"style":206},[339],{"type":37,"value":317},{"type":32,"tag":193,"props":341,"children":342},{"style":200},[343],{"type":37,"value":344}," engines.items():\n",{"type":32,"tag":193,"props":346,"children":348},{"class":195,"line":347},9,[349,354,358],{"type":32,"tag":193,"props":350,"children":351},{"style":200},[352],{"type":37,"value":353},"        response ",{"type":32,"tag":193,"props":355,"children":356},{"style":206},[357],{"type":37,"value":209},{"type":32,"tag":193,"props":359,"children":360},{"style":200},[361],{"type":37,"value":362}," llm_client.complete(\n",{"type":32,"tag":193,"props":364,"children":366},{"class":195,"line":365},10,[367,373,377],{"type":32,"tag":193,"props":368,"children":370},{"style":369},"--shiki-default:#FFAB70",[371],{"type":37,"value":372},"            model",{"type":32,"tag":193,"props":374,"children":375},{"style":206},[376],{"type":37,"value":209},{"type":32,"tag":193,"props":378,"children":379},{"style":200},[380],{"type":37,"value":381},"model,\n",{"type":32,"tag":193,"props":383,"children":385},{"class":195,"line":384},11,[386,391,395],{"type":32,"tag":193,"props":387,"children":388},{"style":369},[389],{"type":37,"value":390},"            prompt",{"type":32,"tag":193,"props":392,"children":393},{"style":206},[394],{"type":37,"value":209},{"type":32,"tag":193,"props":396,"children":397},{"style":200},[398],{"type":37,"value":399},"query,\n",{"type":32,"tag":193,"props":401,"children":403},{"class":195,"line":402},12,[404,409,413,419],{"type":32,"tag":193,"props":405,"children":406},{"style":369},[407],{"type":37,"value":408},"            temperature",{"type":32,"tag":193,"props":410,"children":411},{"style":206},[412],{"type":37,"value":209},{"type":32,"tag":193,"props":414,"children":416},{"style":415},"--shiki-default:#79B8FF",[417],{"type":37,"value":418},"0.3",{"type":32,"tag":193,"props":420,"children":422},{"style":421},"--shiki-default:#6A737D",[423],{"type":37,"value":424},"  # para output consistente\n",{"type":32,"tag":193,"props":426,"children":428},{"class":195,"line":427},13,[429],{"type":32,"tag":193,"props":430,"children":431},{"style":200},[432],{"type":37,"value":433},"        )\n",{"type":32,"tag":193,"props":435,"children":437},{"class":195,"line":436},14,[438],{"type":32,"tag":193,"props":439,"children":440},{"style":200},[441],{"type":37,"value":442},"        store_response(query, engine, response)\n",{"type":32,"tag":33,"props":444,"children":445},{},[446,452],{"type":32,"tag":57,"props":447,"children":449},{"className":448},[],[450],{"type":37,"value":451},"temperature=0.3",{"type":37,"value":453}," es crítico — cuando ejecutes la misma consulta 3 días después, quieres ver un patrón de citación similar. Con temperature 0.7+, las respuestas varían demasiado y pierdes tendencias.",{"type":32,"tag":33,"props":455,"children":456},{},[457,462],{"type":32,"tag":79,"props":458,"children":459},{},[460],{"type":37,"value":461},"Capa 3: Extracción de citaciones",{"type":37,"value":463}," — Extrae citaciones con output estructurado, no regex:",{"type":32,"tag":183,"props":465,"children":467},{"className":185,"code":466,"language":187,"meta":16,"style":16},"extraction_prompt = f\"\"\"\nResponse: {llm_response}\n\nExtract all citations as JSON:\n[{{\"source_domain\": \"example.com\", \"context\": \"brief quote\"}}]\n\"\"\"\n\ncitations = json.loads(llm_client.complete(\n    model=\"gpt-4o-mini\",  # extracción económica\n    prompt=extraction_prompt,\n    response_format={\"type\": \"json_object\"}\n))\n",[468],{"type":32,"tag":57,"props":469,"children":470},{"__ignoreMap":16},[471,493,515,522,530,558,565,572,589,616,633,667],{"type":32,"tag":193,"props":472,"children":473},{"class":195,"line":196},[474,479,483,488],{"type":32,"tag":193,"props":475,"children":476},{"style":200},[477],{"type":37,"value":478},"extraction_prompt ",{"type":32,"tag":193,"props":480,"children":481},{"style":206},[482],{"type":37,"value":209},{"type":32,"tag":193,"props":484,"children":485},{"style":206},[486],{"type":37,"value":487}," f",{"type":32,"tag":193,"props":489,"children":490},{"style":221},[491],{"type":37,"value":492},"\"\"\"\n",{"type":32,"tag":193,"props":494,"children":495},{"class":195,"line":217},[496,501,506,511],{"type":32,"tag":193,"props":497,"children":498},{"style":221},[499],{"type":37,"value":500},"Response: ",{"type":32,"tag":193,"props":502,"children":503},{"style":415},[504],{"type":37,"value":505},"{",{"type":32,"tag":193,"props":507,"children":508},{"style":200},[509],{"type":37,"value":510},"llm_response",{"type":32,"tag":193,"props":512,"children":513},{"style":415},[514],{"type":37,"value":288},{"type":32,"tag":193,"props":516,"children":517},{"class":195,"line":242},[518],{"type":32,"tag":193,"props":519,"children":520},{"emptyLinePlaceholder":295},[521],{"type":37,"value":298},{"type":32,"tag":193,"props":523,"children":524},{"class":195,"line":264},[525],{"type":32,"tag":193,"props":526,"children":527},{"style":221},[528],{"type":37,"value":529},"Extract all citations as JSON:\n",{"type":32,"tag":193,"props":531,"children":532},{"class":195,"line":282},[533,538,543,548,553],{"type":32,"tag":193,"props":534,"children":535},{"style":221},[536],{"type":37,"value":537},"[",{"type":32,"tag":193,"props":539,"children":540},{"style":415},[541],{"type":37,"value":542},"{{",{"type":32,"tag":193,"props":544,"children":545},{"style":221},[546],{"type":37,"value":547},"\"source_domain\": \"example.com\", \"context\": \"brief quote\"",{"type":32,"tag":193,"props":549,"children":550},{"style":415},[551],{"type":37,"value":552},"}}",{"type":32,"tag":193,"props":554,"children":555},{"style":221},[556],{"type":37,"value":557},"]\n",{"type":32,"tag":193,"props":559,"children":560},{"class":195,"line":291},[561],{"type":32,"tag":193,"props":562,"children":563},{"style":221},[564],{"type":37,"value":492},{"type":32,"tag":193,"props":566,"children":567},{"class":195,"line":301},[568],{"type":32,"tag":193,"props":569,"children":570},{"emptyLinePlaceholder":295},[571],{"type":37,"value":298},{"type":32,"tag":193,"props":573,"children":574},{"class":195,"line":26},[575,580,584],{"type":32,"tag":193,"props":576,"children":577},{"style":200},[578],{"type":37,"value":579},"citations ",{"type":32,"tag":193,"props":581,"children":582},{"style":206},[583],{"type":37,"value":209},{"type":32,"tag":193,"props":585,"children":586},{"style":200},[587],{"type":37,"value":588}," json.loads(llm_client.complete(\n",{"type":32,"tag":193,"props":590,"children":591},{"class":195,"line":347},[592,597,601,606,611],{"type":32,"tag":193,"props":593,"children":594},{"style":369},[595],{"type":37,"value":596},"    model",{"type":32,"tag":193,"props":598,"children":599},{"style":206},[600],{"type":37,"value":209},{"type":32,"tag":193,"props":602,"children":603},{"style":221},[604],{"type":37,"value":605},"\"gpt-4o-mini\"",{"type":32,"tag":193,"props":607,"children":608},{"style":200},[609],{"type":37,"value":610},",  ",{"type":32,"tag":193,"props":612,"children":613},{"style":421},[614],{"type":37,"value":615},"# extracción económica\n",{"type":32,"tag":193,"props":617,"children":618},{"class":195,"line":365},[619,624,628],{"type":32,"tag":193,"props":620,"children":621},{"style":369},[622],{"type":37,"value":623},"    prompt",{"type":32,"tag":193,"props":625,"children":626},{"style":206},[627],{"type":37,"value":209},{"type":32,"tag":193,"props":629,"children":630},{"style":200},[631],{"type":37,"value":632},"extraction_prompt,\n",{"type":32,"tag":193,"props":634,"children":635},{"class":195,"line":384},[636,641,645,649,654,658,663],{"type":32,"tag":193,"props":637,"children":638},{"style":369},[639],{"type":37,"value":640},"    response_format",{"type":32,"tag":193,"props":642,"children":643},{"style":206},[644],{"type":37,"value":209},{"type":32,"tag":193,"props":646,"children":647},{"style":200},[648],{"type":37,"value":505},{"type":32,"tag":193,"props":650,"children":651},{"style":221},[652],{"type":37,"value":653},"\"type\"",{"type":32,"tag":193,"props":655,"children":656},{"style":200},[657],{"type":37,"value":229},{"type":32,"tag":193,"props":659,"children":660},{"style":221},[661],{"type":37,"value":662},"\"json_object\"",{"type":32,"tag":193,"props":664,"children":665},{"style":200},[666],{"type":37,"value":288},{"type":32,"tag":193,"props":668,"children":669},{"class":195,"line":402},[670],{"type":32,"tag":193,"props":671,"children":672},{"style":200},[673],{"type":37,"value":674},"))\n",{"type":32,"tag":33,"props":676,"children":677},{},[678],{"type":37,"value":679},"La extracción con regex da 73% de precisión (nuestras pruebas). Output estructurado alcanza 96%. La diferencia de costo es $0.002 por consulta — a escala, output estructurado es obligatorio.",{"type":32,"tag":33,"props":681,"children":682},{},[683,688],{"type":32,"tag":79,"props":684,"children":685},{},[686],{"type":37,"value":687},"Capa 4: Agregación",{"type":37,"value":689}," — Agrupa citaciones por dominio. Tus métricas:",{"type":32,"tag":691,"props":692,"children":693},"table",{},[694,718],{"type":32,"tag":695,"props":696,"children":697},"thead",{},[698],{"type":32,"tag":699,"props":700,"children":701},"tr",{},[702,708,713],{"type":32,"tag":703,"props":704,"children":705},"th",{},[706],{"type":37,"value":707},"Métrica",{"type":32,"tag":703,"props":709,"children":710},{},[711],{"type":37,"value":712},"Fórmula",{"type":32,"tag":703,"props":714,"children":715},{},[716],{"type":37,"value":717},"Objetivo",{"type":32,"tag":719,"props":720,"children":721},"tbody",{},[722,741,759,777],{"type":32,"tag":699,"props":723,"children":724},{},[725,731,736],{"type":32,"tag":726,"props":727,"children":728},"td",{},[729],{"type":37,"value":730},"Tasa de citación",{"type":32,"tag":726,"props":732,"children":733},{},[734],{"type":37,"value":735},"(respuestas con tu citación) \u002F (respuestas relevantes)",{"type":32,"tag":726,"props":737,"children":738},{},[739],{"type":37,"value":740},"8%+",{"type":32,"tag":699,"props":742,"children":743},{},[744,749,754],{"type":32,"tag":726,"props":745,"children":746},{},[747],{"type":37,"value":748},"Share of voice",{"type":32,"tag":726,"props":750,"children":751},{},[752],{"type":37,"value":753},"(tus citaciones) \u002F (suma de todas las citaciones)",{"type":32,"tag":726,"props":755,"children":756},{},[757],{"type":37,"value":758},"15%+",{"type":32,"tag":699,"props":760,"children":761},{},[762,767,772],{"type":32,"tag":726,"props":763,"children":764},{},[765],{"type":37,"value":766},"Posición de rango",{"type":32,"tag":726,"props":768,"children":769},{},[770],{"type":37,"value":771},"Posición mediana de tu citación",{"type":32,"tag":726,"props":773,"children":774},{},[775],{"type":37,"value":776},"Top 3",{"type":32,"tag":699,"props":778,"children":779},{},[780,785,790],{"type":32,"tag":726,"props":781,"children":782},{},[783],{"type":37,"value":784},"Calidad de contexto",{"type":32,"tag":726,"props":786,"children":787},{},[788],{"type":37,"value":789},"Longitud del contexto que acompaña tu citación",{"type":32,"tag":726,"props":791,"children":792},{},[793],{"type":37,"value":794},"40+ caracteres",{"type":32,"tag":33,"props":796,"children":797},{},[798],{"type":37,"value":799},"La calidad de contexto importa — si tu marca aparece como \"example.com ofrece soluciones\", el valor es bajo. Si aparece como \"example.com rastrea 14 puntos de contacto en todo el viaje...\", es alto.",{"type":32,"tag":40,"props":801,"children":803},{"id":802},"implementación-del-stack-de-citación-roibase",[804],{"type":37,"value":805},"Implementación del Stack de Citación Roibase",{"type":32,"tag":33,"props":807,"children":808},{},[809],{"type":37,"value":810},"Hemos llevado este stack a producción en 8 clientes. Arquitectura: orquestación de workflow n8n + extracción con Claude API + almacenamiento en BigQuery + dashboard en Looker Studio.",{"type":32,"tag":33,"props":812,"children":813},{},[814],{"type":32,"tag":79,"props":815,"children":816},{},[817],{"type":37,"value":818},"Anatomía del workflow:",{"type":32,"tag":71,"props":820,"children":821},{},[822,832,842,852],{"type":32,"tag":75,"props":823,"children":824},{},[825,830],{"type":32,"tag":79,"props":826,"children":827},{},[828],{"type":37,"value":829},"Nodo de actualización de consultas",{"type":37,"value":831}," (semanal): Extrae consultas de los últimos 90 días desde GSC API → filtra las relevantes con TF-IDF → escribe en tabla query_pool",{"type":32,"tag":75,"props":833,"children":834},{},[835,840],{"type":32,"tag":79,"props":836,"children":837},{},[838],{"type":37,"value":839},"Nodo de muestreo",{"type":37,"value":841}," (diario): Toma muestra de 200 consultas de query_pool → ejecuta cada una en 3 LLM → escribe respuestas en tabla raw_responses",{"type":32,"tag":75,"props":843,"children":844},{},[845,850],{"type":32,"tag":79,"props":846,"children":847},{},[848],{"type":37,"value":849},"Nodo de extracción",{"type":37,"value":851}," (diario): Envía raw_responses a Claude → extrae JSON de citaciones → normaliza en tabla citations",{"type":32,"tag":75,"props":853,"children":854},{},[855,860],{"type":32,"tag":79,"props":856,"children":857},{},[858],{"type":37,"value":859},"Nodo de agregación",{"type":37,"value":861}," (diario): Calcula métricas desde tabla citations → resume en tabla dashboard_metrics",{"type":32,"tag":33,"props":863,"children":864},{},[865,870],{"type":32,"tag":79,"props":866,"children":867},{},[868],{"type":37,"value":869},"Costo:",{"type":37,"value":871}," 200 consultas\u002Fdía × 3 motores × $0.03\u002Fconsulta = $18\u002Fdía = $540\u002Fmes. Una herramienta promedio de rastreo de citaciones cuesta $2000\u002Fmes. Construir el stack tú mismo es 73% más barato.",{"type":32,"tag":33,"props":873,"children":874},{},[875,880,882,888],{"type":32,"tag":79,"props":876,"children":877},{},[878],{"type":37,"value":879},"Latencia:",{"type":37,"value":881}," El muestreo es el paso más lento — cada consulta tarda 3-8 segundos en responder (depende del LLM). Si paralelizas 200 consultas, toma 12 minutos en total. En serie, 3 horas. En n8n, usa el nodo ",{"type":32,"tag":57,"props":883,"children":885},{"className":884},[],[886],{"type":37,"value":887},"splitInBatches",{"type":37,"value":889}," + 10 ejecuciones concurrentes para paralelizar.",{"type":32,"tag":33,"props":891,"children":892},{},[893],{"type":37,"value":894},"Para extracción de citaciones, usa Claude Sonnet — 18% más barato que GPT-4o, sin diferencia en precisión de extracción. Probamos Gemini Flash, pero la limitación de context window causa pérdida de citaciones en respuestas largas.",{"type":32,"tag":40,"props":896,"children":898},{"id":897},"tácticas-geo-para-elevar-tu-tasa-de-citación",[899],{"type":37,"value":900},"Tácticas GEO para Elevar tu Tasa de Citación",{"type":32,"tag":33,"props":902,"children":903},{},[904],{"type":37,"value":905},"Ya tienes rastreo de citaciones; ahora toca subir la métrica. A diferencia del SEO tradicional, aquí no mandan los backlinks sino la semántica de índices.",{"type":32,"tag":33,"props":907,"children":908},{},[909,914],{"type":32,"tag":79,"props":910,"children":911},{},[912],{"type":37,"value":913},"Táctica 1: Inyección de respuestas estructuradas",{"type":37,"value":915}," — Los LLM prefieren citar formatos de lista y tabla. Agrega este patrón a tus posts de blog:",{"type":32,"tag":183,"props":917,"children":921},{"className":918,"code":919,"language":920,"meta":16,"style":16},"language-markdown shiki shiki-themes github-dark","## 5 Mejores Características de CRM\n\n| Característica | Por Qué Importa | Aplicación de Ejemplo |\n|----------------|-----------------|----------------------|\n| Atribución multicanal | Vincula ingresos al canal correcto | Lead pasó por 7 touchpoints antes de convertir |\n| ...\n","markdown",[922],{"type":32,"tag":57,"props":923,"children":924},{"__ignoreMap":16},[925,934,941,949,957,965],{"type":32,"tag":193,"props":926,"children":927},{"class":195,"line":196},[928],{"type":32,"tag":193,"props":929,"children":931},{"style":930},"--shiki-default:#79B8FF;--shiki-default-font-weight:bold",[932],{"type":37,"value":933},"## 5 Mejores Características de CRM\n",{"type":32,"tag":193,"props":935,"children":936},{"class":195,"line":217},[937],{"type":32,"tag":193,"props":938,"children":939},{"emptyLinePlaceholder":295},[940],{"type":37,"value":298},{"type":32,"tag":193,"props":942,"children":943},{"class":195,"line":242},[944],{"type":32,"tag":193,"props":945,"children":946},{"style":200},[947],{"type":37,"value":948},"| Característica | Por Qué Importa | Aplicación de Ejemplo |\n",{"type":32,"tag":193,"props":950,"children":951},{"class":195,"line":264},[952],{"type":32,"tag":193,"props":953,"children":954},{"style":200},[955],{"type":37,"value":956},"|----------------|-----------------|----------------------|\n",{"type":32,"tag":193,"props":958,"children":959},{"class":195,"line":282},[960],{"type":32,"tag":193,"props":961,"children":962},{"style":200},[963],{"type":37,"value":964},"| Atribución multicanal | Vincula ingresos al canal correcto | Lead pasó por 7 touchpoints antes de convertir |\n",{"type":32,"tag":193,"props":966,"children":967},{"class":195,"line":291},[968],{"type":32,"tag":193,"props":969,"children":970},{"style":200},[971],{"type":37,"value":972},"| ...\n",{"type":32,"tag":33,"props":974,"children":975},{},[976],{"type":37,"value":977},"Después de agregar la tabla, la tasa de citación subió 23% para esa consulta (prueba A\u002FB de 3 meses, 47 posts).",{"type":32,"tag":33,"props":979,"children":980},{},[981,986],{"type":32,"tag":79,"props":982,"children":983},{},[984],{"type":37,"value":985},"Táctica 2: Inyección de estadísticas citables",{"type":37,"value":987}," — Los LLM citan oraciones que contienen números específicos. Acompaña cada claim principal con una cifra: no \"El modelo de atribución importa\", sino \"La atribución multicanal que rastrea 14 touchpoints aumenta ROI 34% (benchmark 2024)\".",{"type":32,"tag":33,"props":989,"children":990},{},[991,996],{"type":32,"tag":79,"props":992,"children":993},{},[994],{"type":37,"value":995},"Táctica 3: Clustering semántico",{"type":37,"value":997}," — Si un LLM cita 3+ páginas diferentes de tu dominio en consultas distintas, envía señal de autoridad temática. Crea clusters en lugar de posts aislados: post principal \"Modelado de Atribución\" + 3 posts profundos: \"First-Touch vs Last-Touch\" + \"Fórmulas de Atribución Multicanal\" + \"Selección de Ventana de Atribución\". La tasa de citación en cluster es 41% más alta que en posts aislados.",{"type":32,"tag":33,"props":999,"children":1000},{},[1001,1006],{"type":32,"tag":79,"props":1002,"children":1003},{},[1004],{"type":37,"value":1005},"Táctica 4: Señalización de actualización",{"type":37,"value":1007}," — Los LLM priorizan timestamps al citar: \"datos de 2024\", \"actualización enero 2025\". Incluye fecha de publicación + última actualización en cada post. Actualiza contenido con más de 6 meses — mismo contenido, solo cambia \"2025\" por \"2026\". Esto da 17% lift en citación (nuestras pruebas).",{"type":32,"tag":33,"props":1009,"children":1010},{},[1011,1013,1022],{"type":37,"value":1012},"Estas tácticas son un subconjunto de ",{"type":32,"tag":1014,"props":1015,"children":1019},"a",{"href":1016,"rel":1017},"https:\u002F\u002Fwww.roibase.com.tr\u002Fes\u002Fgeo",[1018],"nofollow",[1020],{"type":37,"value":1021},"Optimización para Motor Generativo",{"type":37,"value":1023}," — optimización de índice semántico, más compleja que optimización de backlinks.",{"type":32,"tag":40,"props":1025,"children":1027},{"id":1026},"vinculación-de-métricas-de-citación-a-atribución",[1028],{"type":37,"value":1029},"Vinculación de Métricas de Citación a Atribución",{"type":32,"tag":33,"props":1031,"children":1032},{},[1033],{"type":37,"value":1034},"La tasa de citación subió, bien. Pero ¿cómo se traduce a métrica de negocio? Construye un modelo de atribución que conecte citación en LLM → búsqueda branded → conversión.",{"type":32,"tag":33,"props":1036,"children":1037},{},[1038],{"type":32,"tag":79,"props":1039,"children":1040},{},[1041],{"type":37,"value":1042},"Metodología:",{"type":32,"tag":71,"props":1044,"children":1045},{},[1046,1064,1074],{"type":32,"tag":75,"props":1047,"children":1048},{},[1049,1054,1056,1062],{"type":32,"tag":79,"props":1050,"children":1051},{},[1052],{"type":37,"value":1053},"Tagging de referral de LLM:",{"type":37,"value":1055}," Cuando tu marca aparece citada y el usuario llega a tu sitio, agrega el tag ",{"type":32,"tag":57,"props":1057,"children":1059},{"className":1058},[],[1060],{"type":37,"value":1061},"utm_source=llm_citation",{"type":37,"value":1063},". Desafío: Perplexity\u002FChatGPT no tienen UTM en links — pero 12% de usuarios luego hacen búsqueda branded.",{"type":32,"tag":75,"props":1065,"children":1066},{},[1067,1072],{"type":32,"tag":79,"props":1068,"children":1069},{},[1070],{"type":37,"value":1071},"Correlación de spike de búsqueda branded:",{"type":37,"value":1073}," Existe correlación de 0.68 entre aumento de tasa de citación y aumento de volumen de búsqueda branded, con lag de 7 días (nuestros datos, 14 meses). Cuando la tasa de citación subió de 5% a 11%, la búsqueda branded aumentó 28% en 3 semanas.",{"type":32,"tag":75,"props":1075,"children":1076},{},[1077,1082],{"type":32,"tag":79,"props":1078,"children":1079},{},[1080],{"type":37,"value":1081},"Prueba con control:",{"type":37,"value":1083}," Ejecuta una campaña GEO en una categoría vertical y mantén otra como baseline. Observa la diferencia en búsqueda branded. En e-commerce, push agresivo de GEO = 43% lift branded en 6 meses. En SaaS, baseline = 8% lift.",{"type":32,"tag":33,"props":1085,"children":1086},{},[1087,1089,1096],{"type":37,"value":1088},"Para el modelo de atribución citación → conversión, necesitas ",{"type":32,"tag":1014,"props":1090,"children":1093},{"href":1091,"rel":1092},"https:\u002F\u002Fwww.roibase.com.tr\u002Fes\u002Ffirstparty",[1018],[1094],{"type":37,"value":1095},"Arquitectura de Medición y Datos First-Party",{"type":37,"value":1097}," — GA4 no lo captura porque interpreta el referral de LLM como tráfico directo.",{"type":32,"tag":40,"props":1099,"children":1101},{"id":1100},"dashboard-visualización-de-métricas-de-citación",[1102],{"type":37,"value":1103},"Dashboard: Visualización de Métricas de Citación",{"type":32,"tag":33,"props":1105,"children":1106},{},[1107],{"type":37,"value":1108},"Tu stack de rastreo escribe en un data lake. Ahora conviértelo en un dashboard ejecutivo. 3 visualizaciones críticas:",{"type":32,"tag":33,"props":1110,"children":1111},{},[1112,1117],{"type":32,"tag":79,"props":1113,"children":1114},{},[1115],{"type":37,"value":1116},"1. Serie temporal de tasa de citación",{"type":37,"value":1118}," — Tasa de citación semanal, desglose por motor. Eje Y: 0-15%, Eje X: 12 semanas. 3 líneas: Perplexity (naranja), ChatGPT (verde), Gemini (azul). Si ves un spike en Gemini, prioriza Google SGE — podría haber data share.",{"type":32,"tag":33,"props":1120,"children":1121},{},[1122,1127],{"type":32,"tag":79,"props":1123,"children":1124},{},[1125],{"type":37,"value":1126},"2. Gráfico de share of voice competitivo",{"type":37,"value":1128}," — Gráfico de barras horizontal: tu dominio + top 5 competidores. Tú debes estar arriba. Si un competidor está en 18% SoV y tú en 6%, pierdes autoridad temática — falta clustering de contenido.",{"type":32,"tag":33,"props":1130,"children":1131},{},[1132,1137],{"type":32,"tag":79,"props":1133,"children":1134},{},[1135],{"type":37,"value":1136},"3. Mapa de calor de calidad de contexto",{"type":37,"value":1138}," — Eje X: categorías de consulta (producto, pricing, comparación). Eje Y: bins de longitud de contexto (0-20, 20-40, 40+). Verde oscuro = mucha citación + contexto largo. Blanco = sin citación. Si tu categoría de pricing es blanca, optimiza tu pricing page para LLM.",{"type":32,"tag":33,"props":1140,"children":1141},{},[1142],{"type":37,"value":1143},"Muestra el dashboard en la llamada semanal de ingresos. El CMO preguntará \"¿para qué sirve esto?\" — muéstrale la correlación con búsqueda branded. El CFO preguntará por el ROI — muéstrale el modelo de atribución de tráfico de LLM.",{"type":32,"tag":33,"props":1145,"children":1146},{},[1147],{"type":37,"value":1148},"No compares métricas de citación con GA4 — son etapas distintas del funnel. GA4 mide \"visita al sitio\", la citación mide \"awareness de marca\". La citación es métrica de awareness, GA4 es métrica de consideración.",{"type":32,"tag":40,"props":1150,"children":1152},{"id":1151},"lo-que-debes-hacer-ahora",[1153],{"type":37,"value":1154},"Lo que Debes Hacer Ahora",{"type":32,"tag":33,"props":1156,"children":1157},{},[1158],{"type":37,"value":1159},"Si implementas GEO sin rastreo de citaciones, avanzas a ciegas. Semana 1: exporta consultas de GSC → toma muestra de 50 → prueba manual en 3 LLM → ¿cuántas veces fuiste citado? Ese es tu baseline. Semana 2: configura el stack de rastreo (n8n + Claude). Semana 3: aplica las primeras tácticas GEO (respuesta estructurada, inyección de estadísticas). Semana 4: revisa la tasa de citación — ¿hay desviación del baseline?",{"type":32,"tag":33,"props":1161,"children":1162},{},[1163],{"type":37,"value":1164},"Si tu tasa de citación está por encima de 8% en tu industria, tienes autoridad temática. Si está por debajo, necesitas llenar la brecha semántica. Subir de 3% a 8% toma 6 meses — combinación de clustering de contenido + señalización de actualización + formato estructurado. Pero una vez llegues a 8%, verás lift en búsqueda branded. 
La tasa de citación es tu nueva métrica north star — tan crítica como CTR, porque los usuarios ya no hacen clic, toman decisiones viendo.",{"type":32,"tag":1166,"props":1167,"children":1168},"style",{},[1169],{"type":37,"value":1170},"html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}",{"title":16,"searchDepth":242,"depth":242,"links":1172},[1173,1174,1175,1176,1177,1178,1179],{"id":42,"depth":217,"text":45},{"id":108,"depth":217,"text":111},{"id":802,"depth":217,"text":805},{"id":897,"depth":217,"text":900},{"id":1026,"depth":217,"text":1029},{"id":1100,"depth":217,"text":1103},{"id":1151,"depth":217,"text":1154},"content:es:ai:medicion-de-citas-llm.md","content","es\u002Fai\u002Fmedicion-de-citas-llm.md","es\u002Fai\u002Fmedicion-de-citas-llm","md",1778681008088]