{"id":22818,"date":"2025-01-31T14:46:05","date_gmt":"2025-01-31T14:46:05","guid":{"rendered":"https:\/\/algotrading101.com\/learn\/?p=22818"},"modified":"2025-02-04T21:45:51","modified_gmt":"2025-02-04T21:45:51","slug":"magentic-llm-guide","status":"publish","type":"post","link":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/","title":{"rendered":"How to build LLM Agents with Magentic"},"content":{"rendered":"\n<figure class=\"wp-block-image size-full\"><img fetchpriority=\"high\" decoding=\"async\" width=\"900\" height=\"675\" src=\"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail.webp\" alt=\"\" class=\"wp-image-22805\" srcset=\"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail.webp 900w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail-300x225.webp 300w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail-768x576.webp 768w\" sizes=\"(max-width: 900px) 100vw, 900px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Table of contents:<\/h3>\n\n\n\n<ol>\n<li><a href=\"#what-is-magentic\">What is Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-pros\">Why should I use Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-cons\">Why shouldn\u2019t I use Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-alternatives\">What are some Magentic alternatives?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-llms\">Which LLMs does Magentic support?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-start\">Getting started<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-prompt-decorator\">How to use Magentic prompt decorator?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-chatprompt-decorator\">How to use Magentic chatprompt decorator?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-function-calling\">How to do LLM function calling with Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-prompt-chains\">How to use 
Magentic prompt chains?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-stream-response\">How to stream LLM responses with Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-stream-structured-response\">How to stream structured LLM outputs with Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-async-stream\">How to async stream LLM outputs with Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-custom-rag\">How to build a custom RAG with Magentic?<\/a><\/li>\n\n\n\n<li><a href=\"#magentic-full-code\">Full code<\/a><\/li>\n<\/ol>\n\n\n\n<a name=\"what-is-magentic\">\n\n\n\n<h2 class=\"wp-block-heading\">What is Magentic?<\/h2>\n\n\n\n<p>Magentic is an open-source framework that allows for the seamless integration of Large Language Models (LLMs) into Python code.<\/p>\n\n\n\n<p>Website: <a href=\"https:\/\/magentic.dev\/\">Magentic<\/a><\/p>\n\n\n\n<p>GitHub Repository: <a href=\"https:\/\/github.com\/jackmpcollins\/magentic\">jackmpcollins\/magentic: Seamlessly integrate LLMs as Python functions<\/a><\/p>\n\n\n\n<a name=\"magentic-pros\">\n\n\n\n<h2 class=\"wp-block-heading\">Why should I use Magentic?<\/h2>\n\n\n\n<ul>\n<li>Magentic is easy to use.<\/li>\n\n\n\n<li>Magentic is free and open source.<\/li>\n\n\n\n<li>It is actively maintained.<\/li>\n\n\n\n<li>It plays well with Pydantic to produce structured outputs.<\/li>\n\n\n\n<li>It supports vision, streaming, parallel function calling, and more.<\/li>\n<\/ul>\n\n\n\n<a name=\"magentic-cons\">\n\n\n\n<h2 class=\"wp-block-heading\">Why shouldn&#8217;t I use Magentic?<\/h2>\n\n\n\n<ul>\n<li>Magentic is mainly maintained by just one person.<\/li>\n\n\n\n<li>Other LLM frameworks offer broader ecosystems and more integrations.<\/li>\n\n\n\n<li>It doesn&#8217;t have a big community around it.<\/li>\n<\/ul>\n\n\n\n<a name=\"magentic-alternatives\">\n\n\n\n<h2 class=\"wp-block-heading\">What are some Magentic alternatives?<\/h2>\n\n\n\n<p>Some Magentic alternatives are:<\/p>\n\n\n\n<ul>\n<li><a 
href=\"https:\/\/langchain-ai.github.io\/langgraph\/\">LangGraph<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/docs.llamaindex.ai\/en\/stable\/\">LlamaIndex<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/microsoft.github.io\/autogen\/stable\/\">AutoGen<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.crewai.com\/\">CrewAI<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/github.com\/deepset-ai\/haystack\">Haystack<\/a><\/li>\n<\/ul>\n\n\n\n<a name=\"magentic-llms\">\n\n\n\n<h2 class=\"wp-block-heading\">Which LLMs does Magentic support?<\/h2>\n\n\n\n<p>Magentic supports these LLMs:<\/p>\n\n\n\n<ul>\n<li><a href=\"https:\/\/openai.com\/\">OpenAI LLMs<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/ollama.com\/\">Ollama<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.anthropic.com\/\">Anthropic<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/mistral.ai\/\">Mistral<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/docs.litellm.ai\/\">LiteLLM<\/a><\/li>\n\n\n\n<li>or any other OpenAI schema-compatible model<\/li>\n<\/ul>\n\n\n\n<a name=\"magentic-start\">\n\n\n\n<h2 class=\"wp-block-heading\">Getting started<\/h2>\n\n\n\n<p>To get started with Magentic, ensure that you have installed Python and obtained an API key for one of the LLM providers mentioned above. I will personally go with ChatGPT as the LLM of choice. 
<\/p>\n\n\n\n<p>The next step will be to install Magentic into a fresh environment with the following command:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-bash\" data-lang=\"Bash\"><code>pip install magentic<\/code><\/pre><\/div>\n\n\n\n<p>Now, we will create a <code>.env<\/code> file to which we will add the needed environment variables.<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-bash\" data-lang=\"Bash\"><code>touch .env<\/code><\/pre><\/div>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-plain\"><code>MAGENTIC_BACKEND=openai\nMAGENTIC_OPENAI_API_KEY=sk-...\nMAGENTIC_OPENAI_MODEL=gpt-4<\/code><\/pre><\/div>\n\n\n\n<p>In the following sections, we will go over the basic building blocks of Magentic and create a custom agentic RAG pipeline that uses financial data and function calling to answer user queries.<\/p>\n\n\n\n<p>Let&#8217;s begin.<\/p>\n\n\n\n<a name=\"magentic-prompt-decorator\">\n\n\n\n<h2 class=\"wp-block-heading\">How to use Magentic prompt decorator?<\/h2>\n\n\n\n<p>To use the Magentic <code>@prompt<\/code> decorator, you define a template for an LLM prompt as a Python function. When this function is called, the arguments are inserted into the template and the prompt is sent to an LLM, which generates the function output.<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>from magentic import prompt\n\n@prompt(&quot;Explain like I&#39;m five this financial concept: {concept}&quot;)\ndef explain(concept: str) -&gt; str: ...\n\nexplain(&quot;Subprime mortgage crisis&quot;)<\/code><\/pre><\/div>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">&#8220;The Subprime Mortgage Crisis is like a big, grown-up version of a game of hot potato. 
Here\u2019s how it works:<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">Let\u2019s say you want to buy an ice cream cone, but you don\u2019t have any money. So, a nice person (in this case, a bank) is willing to lend you money to buy your ice cream. But, they know you don\u2019t have any money to pay them back right away, which makes lending the money to you pretty risky. This would be a subprime mortgage, or loan.<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">Now, imagine that the person lending you money to buy your ice cream gets worried because they know you may not be able to pay them back. To get rid of this problem, they decide to sell your loan to another person (an investor). This means now you owe the money to this new person, not the one who lent you the money in the first place.<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">This makes the first person feel safe because they got their money back, but now the new person is the one who may lose money if you can&#8217;t pay them back.<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">Then, imagine this happening with not just one ice cream cone, but millions of them. Loads of people can\u2019t pay back their ice cream loans, so the new owners of these loans lose a lot of money. This big problem is like the Subprime Mortgage Crisis.<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">Importantly, it&#8217;s not really about ice cream, but about bigger expensive things like houses. 
When too many people couldn\u2019t pay back their loans, the banks and investors that owned those loans lost a lot of money, and this caused huge problems for the whole world&#8217;s money system.&#8221;<\/mark><\/p>\n\n\n\n<p>Notice how the <code>@prompt<\/code> decorator plays nicely with the function and curly braces. Moreover, the function doesn&#8217;t need to have a body as everything is done by the decorator.<\/p>\n\n\n\n<p>The&nbsp;<code>@prompt<\/code>&nbsp;decorator will respect the return type annotation of the decorated function. This can be&nbsp;<a href=\"https:\/\/docs.pydantic.dev\/latest\/usage\/types\/types\/\">any type supported by pydantic<\/a>&nbsp;including a&nbsp;<code>pydantic<\/code>&nbsp;model.<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>from magentic import prompt\nfrom pydantic import BaseModel\n\n\nclass Portfolio(BaseModel):\n    equity_etf_pct: float\n    bond_etf_pc: float\n    crypto_etf_pc: float\n    commodities_pc: float\n    reasoning: str\n\n\n@prompt(&quot;Create a strong portfolio of {size} allocation size.&quot;)\ndef create_portfolio(size: str) -&gt; Portfolio: ...\n\nportfolio = create_portfolio(&quot;$50,000&quot;)\nprint(portfolio)<\/code><\/pre><\/div>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">equity_etf_pct=50.0<br>bond_etf_pc=30.0<br>crypto_etf_pc=10.0<br>commodities_pc=10.0<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">reasoning=&#8217;A balanced strong portfolio suitable for most risk tolerances would allocate around 50% towards Equity ETFs for growth, 30% towards Bond ETFs for income and stability, 10% towards Crypto ETFs for high-growth and high-risk appetite and 10% towards commodities for a balanced protection against inflation. 
The allocation size is $50,000.&#8217;<\/mark><\/p>\n\n\n\n<a name=\"magentic-chatprompt-decorator\">\n\n\n\n<h2 class=\"wp-block-heading\">How to use Magentic chatprompt decorator?<\/h2>\n\n\n\n<p>To use the Magentic <code>@chatprompt<\/code> decorator, you pass chat messages as a template rather than a single text prompt. We can also provide a system message or few-shot prompting example responses to guide the model&#8217;s output.&nbsp;<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>from magentic import chatprompt, AssistantMessage, SystemMessage, UserMessage\nfrom pydantic import BaseModel\n\n\nclass Quote(BaseModel):\n    quote: str\n    person: str\n\n\n@chatprompt(\n    SystemMessage(&quot;You are an avid reader of financial literature.&quot;),\n    UserMessage(&quot;What is your favorite quote from Warren Buffett?&quot;),\n    AssistantMessage(\n        Quote(\n            quote=&quot;Price is what you pay; value is what you get.&quot;,\n            person=&quot;Warren Buffett&quot;,\n        )\n    ),\n    UserMessage(&quot;What is your favorite quote from {person}?&quot;),\n)\ndef get_finance_quote(person: str) -&gt; Quote: ...\n\n\nget_finance_quote(&quot;Charlie Munger&quot;)<\/code><\/pre><\/div>\n\n\n\n<p>In my whole life, I have known no wise people (over a broad subject matter area) who didn\u2019t read all the time \u2013 none, zero.<\/p>\n\n\n\n<a name=\"magentic-function-calling\">\n\n\n\n<h2 class=\"wp-block-heading\">How to do LLM function calling with Magentic?<\/h2>\n\n\n\n<p>To do LLM function calling with Magentic, you will use a <code>@prompt<\/code>-decorated function that returns a&nbsp;<code>FunctionCall<\/code>&nbsp;object, which can be called to execute the function using the arguments provided by the LLM. 
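Conceptually, the returned object just bundles the chosen function with the LLM-supplied arguments and defers execution until the object itself is called. A minimal, LLM-free sketch of that behavior (the class and the stubbed data are illustrative, not Magentic's actual implementation):

```python
# Illustrative stand-in for magentic's FunctionCall: a function plus
# LLM-chosen arguments, executed only when the object itself is called.
class FunctionCallSketch:
    def __init__(self, function, **arguments):
        self._function = function
        self.arguments = arguments

    def __call__(self):
        # Deferred execution with the stored arguments
        return self._function(**self.arguments)


def get_daily_price(ticker: str) -> dict:
    # Stubbed data source standing in for a real API request
    return {"ticker": ticker, "close": 229.86}


# Pretend the LLM selected get_daily_price with ticker="AAPL"
output = FunctionCallSketch(get_daily_price, ticker="AAPL")
print(output())  # executes the underlying function with the stored arguments
```

This is why `perform_search(...)` below returns immediately and the API request only happens when you invoke `output()`.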
For example, let&#8217;s ask for price data and have it use Alpha Vantage:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>import os\nimport requests\nfrom magentic import prompt, FunctionCall\n\nAV_API_KEY = os.getenv(&quot;AV_API_KEY&quot;)\n\ndef get_daily_price(ticker: str, api_key: str = AV_API_KEY) -&gt; dict:\n    url = f&#39;https:\/\/www.alphavantage.co\/query?function=TIME_SERIES_DAILY&symbol={ticker}&apikey={api_key}&#39;\n    r = requests.get(url)\n    data = r.json()\n    return data[&#39;Time Series (Daily)&#39;]\n\n\n@prompt(\n    &quot;Use the appropriate search function to answer: {question}&quot;,\n    functions=[get_daily_price],\n)\ndef perform_search(question: str) -&gt; FunctionCall[str]: ...\n\n\noutput = perform_search(&quot;What is the daily price data of AAPL?&quot;)<\/code><\/pre><\/div>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>output()\n\n&gt;&gt;&gt; {&#39;2025-01-27&#39;: {&#39;1. open&#39;: &#39;224.1200&#39;,\n  &#39;2. high&#39;: &#39;232.1500&#39;,\n  &#39;3. low&#39;: &#39;224.0000&#39;,\n  &#39;4. close&#39;: &#39;229.8600&#39;,\n  &#39;5. volume&#39;: &#39;94224324&#39;}, ...<\/code><\/pre><\/div>\n\n\n\n<p>Take note that Alpha Vantage has a free tier so grabbing an API key shouldn&#8217;t be an issue if you are following along. Alternatively, you can use any other preferred data provider.<\/p>\n\n\n\n<a name=\"magentic-prompt-chains\">\n\n\n\n<h2 class=\"wp-block-heading\">How to use Magentic prompt chains?<\/h2>\n\n\n\n<p>You will want to use Magentic prompt chains when it is required for the LLM to do several operations before returning the final response. 
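The idea can be pictured as a resolve-and-retry loop: if the model returns a function call, execute it, feed the result back, and ask again until a plain answer comes out. A rough, LLM-free sketch of that control flow (the fake model and all names are purely illustrative):

```python
# LLM-free sketch of prompt-chain resolution: keep executing callable steps
# and feeding their results back until the "model" returns a final string.
def fake_llm(history: list[str]):
    # Stand-in for the model: first request data, then answer using it.
    if not any("earnings data" in msg for msg in history):
        return lambda: "earnings data: 2025-04-22, 2025-07-22"  # a "FunctionCall"
    return "IBM reports on 2025-04-22 and 2025-07-22."


def prompt_chain_sketch(question: str) -> str:
    history = [question]
    while True:
        step = fake_llm(history)
        if callable(step):            # the model returned a function call
            history.append(step())    # execute it and feed the output back
        else:
            return step               # the model returned the final answer


print(prompt_chain_sketch("What are IBM's expected earnings dates?"))
```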
The&nbsp;<code>@prompt_chain<\/code>&nbsp;decorator will resolve&nbsp;<code>FunctionCall<\/code>&nbsp;objects automatically and pass the output back to the LLM to continue until the final answer is reached.<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>import csv\nfrom magentic import prompt_chain\n\n\ndef get_earnings_calendar(ticker: str, api_key: str = AV_API_KEY) -&gt; list:\n    url = f&quot;https:\/\/www.alphavantage.co\/query?function=EARNINGS_CALENDAR&symbol={ticker}&horizon=12month&apikey={api_key}&quot;\n    with requests.Session() as s:\n        download = s.get(url)\n        decoded_content = download.content.decode(&#39;utf-8&#39;)\n        cr = csv.reader(decoded_content.splitlines(), delimiter=&#39;,&#39;)\n        my_list = list(cr)\n    return my_list\n\n\n@prompt_chain(\n    &quot;What&#39;s {ticker} expected earnings dates for the next 12 months?&quot;,\n    functions=[get_earnings_calendar],\n)\ndef get_earnings(ticker: str) -&gt; str: ...\n\n\nget_earnings(&quot;IBM&quot;)<\/code><\/pre><\/div>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">&#8216;The expected earnings dates for IBM for the fiscal year 2025 are as follows:<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">&#8211; On April 22, 2025 for fiscal date ending March 31, 2025<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">&#8211; On July 22, 2025 for fiscal date ending June 30, 2025<\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">Please note that these dates are for fiscal year 2025 and the exact figures for the earnings are not available yet. These dates are expected and may change. 
The currency for these earnings is in USD.&#8217;<\/mark><\/p>\n\n\n\n<p>Prompt chains are the &#8220;bread and butter&#8221; of building agentic workflows, as they let us create a loop where the LLM can perform multiple API calls until it collects all the information it needs to provide an answer.<\/p>\n\n\n\n<p>With some good prompting techniques and a good pipeline, one can perform many complex tasks.<\/p>\n\n\n\n<a name=\"magentic-stream-response\">\n\n\n\n<h2 class=\"wp-block-heading\">How to stream LLM responses with Magentic?<\/h2>\n\n\n\n<p>To stream LLM responses with Magentic, you will use the&nbsp;<code>StreamedStr<\/code>&nbsp;(and&nbsp;<code>AsyncStreamedStr<\/code>) class. This allows you to process the text while it is being generated, rather than receiving the whole output at once. For example:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>from magentic import StreamedStr\n\n\n@prompt(&quot;Explain to me {term} in a way a 5-year-old would understand.&quot;)\ndef describe_finance_term(term: str) -&gt; StreamedStr: ...\n\n\n# Print the chunks while they are being received\nfor chunk in describe_finance_term(&quot;liquidity&quot;):\n    print(chunk, end=&quot;&quot;)<\/code><\/pre><\/div>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">&#8220;Liquidity is like having a toy that everyone wants to trade for. If you want to swap your toy for something else, you can do it easily and quickly because everyone wants your toy. That&#8217;s like having an asset or money that is liquid. 
You can exchange it easily and quickly without losing its value.&#8221;<\/mark><\/p>\n\n\n\n<p>You can now read the response as it is being generated and also inject it into other processes to speed things up.<\/p>\n\n\n\n<a name=\"magentic-stream-structured-response\">\n\n\n\n<h2 class=\"wp-block-heading\">How to stream structured LLM outputs with Magentic?<\/h2>\n\n\n\n<p>To stream structured LLM outputs with Magentic, you will utilize the return type annotation&nbsp;<code>Iterable<\/code>&nbsp;(or&nbsp;<code>AsyncIterable<\/code>). This allows each item to be processed while the next one is being generated. For example:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>from collections.abc import Iterable\nfrom time import time\n\n\nclass Portfolio(BaseModel):\n    equity_etf_pct: float\n    bond_etf_pc: float\n    crypto_etf_pc: float\n    commodities_pc: float\n    reasoning: str\n\n\n@prompt(&quot;Create {n_portfolio} portfolios with varying degrees of risk appetite.&quot;)\ndef create_portfolios(n_portfolio: int) -&gt; Iterable[Portfolio]: ...\n\n\nstart_time = time()\nfor portfolio in create_portfolios(3):\n    print(f&quot;{time() - start_time:.2f}s : {portfolio}&quot;)<\/code><\/pre><\/div>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">7.25s : equity_etf_pct=60.0 bond_etf_pc=30.0 crypto_etf_pc=5.0 commodities_pc=5.0 reasoning=&#8221;This portfolio is for an individual with a moderate risk tolerance. Having a larger part of the portfolio in equity ETF&#8217;s and bonds provides a balance of growth and stability. 
A small allocation is made to crypto and commodities in order to try and take advantage of potential high returns.&#8221; <\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">11.46s : equity_etf_pct=40.0 bond_etf_pc=55.0 crypto_etf_pc=2.0 commodities_pc=3.0 reasoning=&#8217;This portfolio is oriented towards an individual with a low risk tolerance. In this case, most of the portfolio is in bonds, which are generally lower risk. The allocation is also diversified with some investments in equity ETFs, crypto and commodities.&#8217; <\/mark><\/p>\n\n\n\n<p><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">14.78s : equity_etf_pct=70.0 bond_etf_pc=15.0 crypto_etf_pc=10.0 commodities_pc=5.0 reasoning=&#8221;This portfolio is for someone with a high risk tolerance. It allocates a majority of the holdings towards equity ETF&#8217;s for stronger growth potential. There is also increased investment in volatile areas such as cryptocurrency in the hope of achieving high returns.&#8221;<\/mark><\/p>\n\n\n\n<a name=\"magentic-async-stream\">\n\n\n\n<h2 class=\"wp-block-heading\">How to async stream LLM outputs with Magentic?<\/h2>\n\n\n\n<p>To async stream LLM outputs with Magentic, you will create <code>async <\/code>functions and utilize the <code>AsyncIterable<\/code> class where needed. 
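The scheduling pattern itself is plain asyncio and can be tried without any LLM at all; this standalone sketch (with simulated delays and illustrative names in place of real model calls) shows follow-up tasks being created while an async iterable is still producing items:

```python
import asyncio


async def iter_items():
    # Stand-in for an AsyncIterable LLM output: yields items with a small delay
    for symbol in ["TSLA", "ZM", "AMD"]:
        await asyncio.sleep(0.01)
        yield symbol


async def describe(symbol: str) -> str:
    await asyncio.sleep(0.05)  # simulated per-item LLM latency
    return f"{symbol}: description"


async def main() -> list[str]:
    tasks = []
    async for symbol in iter_items():
        # Schedule each description while the stream is still producing items,
        # so the per-item work overlaps with the remaining iteration
        tasks.append(asyncio.create_task(describe(symbol)))
    return await asyncio.gather(*tasks)


results = asyncio.run(main())
print(results)
```

Because the tasks overlap, total wall time is roughly the iteration time plus one description, rather than the sum of all of them.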
This lets us query the LLM concurrently, speeding up generation, and allows other asynchronous code to run while waiting on the LLM output.<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>import asyncio\nfrom typing import AsyncIterable\n\n\n@prompt(&quot;List three high-growth stocks.&quot;)\nasync def iter_growth_stocks() -&gt; AsyncIterable[str]: ...\n\n\n@prompt(&quot;Tell me more about {stock_symbol}&quot;)\nasync def tell_me_more_about(stock_symbol: str) -&gt; str: ...\n\n\nstart_time = time()\ntasks = []\nasync for stock in await iter_growth_stocks():\n    # Use asyncio.create_task to schedule the coroutine for execution before awaiting it\n    # This way descriptions will start being generated while the list of stocks is still being generated\n    task = asyncio.create_task(tell_me_more_about(stock))\n    tasks.append(task)\n\ndescriptions = await asyncio.gather(*tasks)\n\nfor desc in descriptions:\n    print(desc)<\/code><\/pre><\/div>\n\n\n\n<p>I will spare you the verbose descriptions, but it mentioned Tesla, Zoom, and AMD.<\/p>\n\n\n\n<p>Okay, now that we have covered the main building blocks that Magentic offers, we can create our RAG (Retrieval-Augmented Generation) pipeline and build a small interface for it.<\/p>\n\n\n\n<a name=\"magentic-custom-rag\">\n\n\n\n<h2 class=\"wp-block-heading\">How to build a custom RAG with Magentic?<\/h2>\n\n\n\n<p>To build a custom <a href=\"https:\/\/www.nvidia.com\/en-us\/glossary\/retrieval-augmented-generation\/\" target=\"_blank\" rel=\"noreferrer noopener\">Retrieval Augmented Generation (RAG)<\/a> pipeline with Magentic, we will asynchronously use its building blocks. 
The overall idea will be that the RAG pipeline can perform investment analysis using the Alpha Vantage API.<\/p>\n\n\n\n<p>The goal is to dynamically determine which Alpha Vantage endpoints to call based on user queries, retrieve the necessary data, and then format it into a structured response.<\/p>\n\n\n\n<p>We will achieve this with four steps in our code:<\/p>\n\n\n\n<p><strong>Function Selection<\/strong>: Determines which Alpha Vantage endpoints to use based on the input query.<\/p>\n\n\n\n<p><strong>Retrieval<\/strong>: Calls the selected functions to fetch relevant data.<\/p>\n\n\n\n<p><strong>Processing<\/strong>: Formats the retrieved data for LLM consumption.<\/p>\n\n\n\n<p><strong>Generation<\/strong>: The LLM responds based on the gathered data.<\/p>\n\n\n\n<p><strong>Note: Keep in mind that this is a very simple and not-so-efficient pipeline. <\/strong><\/p>\n\n\n\n<p>For this, we will also want to use FastAPI as the framework of choice:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-bash\" data-lang=\"Bash\"><code>pip install fastapi\npip install uvicorn<\/code><\/pre><\/div>\n\n\n\n<p>Now, let us import the needed libraries and set up the FastAPI app, the Alpha Vantage API key, and the functions that will gather the data from Alpha Vantage:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>import csv\nimport os\nimport requests\nfrom typing import Any, AsyncGenerator\nfrom fastapi import FastAPI\nfrom fastapi.responses import StreamingResponse\nfrom magentic import (\n    AssistantMessage,\n    SystemMessage,\n    prompt,\n    chatprompt,\n    FunctionCall,\n    UserMessage,\n)\n\nAV_API_KEY = os.getenv(&quot;AV_API_KEY&quot;)\n\napp = FastAPI()\n\n\nasync def get_earnings_calendar(ticker: str, api_key: str = AV_API_KEY) -&gt; dict:\n    &quot;&quot;&quot;Fetches upcoming earnings dates for a given ticker.&quot;&quot;&quot;\n    url = 
f&quot;https:\/\/www.alphavantage.co\/query?function=EARNINGS_CALENDAR&symbol={ticker}&horizon=12month&apikey={api_key}&quot;\n    response = requests.get(url, timeout=30)\n    decoded_content = response.content.decode(&quot;utf-8&quot;)\n    cr = csv.reader(decoded_content.splitlines(), delimiter=&quot;,&quot;)\n    data = list(cr)\n    return {&quot;data&quot;: data}\n\n\nasync def get_news_sentiment(\n    ticker: str, limit: int = 5, api_key: str = AV_API_KEY\n) -&gt; list[dict]:\n    &quot;&quot;&quot;Fetches sentiment analysis on financial news related to the ticker.&quot;&quot;&quot;\n    url = f&quot;https:\/\/www.alphavantage.co\/query?function=NEWS_SENTIMENT&tickers={ticker}&apikey={api_key}&quot;\n    response = requests.get(url, timeout=30).json().get(&quot;feed&quot;, [])[:limit]\n    fields = [\n        &quot;time_published&quot;,\n        &quot;title&quot;,\n        &quot;summary&quot;,\n        &quot;topics&quot;,\n        &quot;overall_sentiment_score&quot;,\n        &quot;overall_sentiment_label&quot;,\n    ]\n    return [{field: article[field] for field in fields} for article in response]\n\n\nasync def get_daily_price(ticker: str, api_key: str = AV_API_KEY) -&gt; dict[str, Any]:\n    &quot;&quot;&quot;Fetches daily price data for a given stock ticker.&quot;&quot;&quot;\n    url = f&quot;https:\/\/www.alphavantage.co\/query?function=TIME_SERIES_DAILY&symbol={ticker}&apikey={api_key}&quot;\n    response = requests.get(url, timeout=30).json()\n    return response.get(&quot;Time Series (Daily)&quot;, {})\n\n\nasync def get_company_overview(\n    ticker: str, api_key: str = AV_API_KEY\n) -&gt; dict[str, Any]:\n    &quot;&quot;&quot;Fetches fundamental company data like market cap, P\/E ratio, and sector.&quot;&quot;&quot;\n    url = f&quot;https:\/\/www.alphavantage.co\/query?function=OVERVIEW&symbol={ticker}&apikey={api_key}&quot;\n    return requests.get(url, timeout=30).json()\n\n\nasync def get_sector_performance(api_key: str = AV_API_KEY) -&gt; 
dict[str, Any]:\n    &quot;&quot;&quot;Fetches market-wide sector performance data.&quot;&quot;&quot;\n    url = f&quot;https:\/\/www.alphavantage.co\/query?function=SECTOR&apikey={api_key}&quot;\n    return requests.get(url, timeout=30).json()<\/code><\/pre><\/div>\n\n\n\n<p>We should be careful about how much data we grab and pass to the LLM in the final stages. This is especially true when dealing with news-like data, as it can easily overflow the context window. I conservatively set the limit to 5 news sentiment pieces for now.<\/p>\n\n\n\n<p>There are strategies for managing this, such as compression, trimming, and\/or additional agents that summarize the payloads before they reach the final prompt.<\/p>\n\n\n\n<p>Now we will pass these functions into an LLM function that will choose the right data sources to answer the user&#8217;s questions:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>@prompt(\n    &quot;&quot;&quot;\n    You are an investment research assistant. 
\n    You need to answer the user&#39;s question: {question}\n    Use available functions to retrieve the data you need.\n    DO NOT request data from functions that have already been used!\n    If all necessary data has been retrieved, return `None`.\n    Here is what has already been retrieved: {called_functions}\n    &quot;&quot;&quot;,\n    functions=[\n        get_daily_price,\n        get_company_overview,\n        get_sector_performance,\n        get_news_sentiment,\n        get_earnings_calendar,\n    ],\n)\ndef iterative_search(\n    question: str, called_functions: set[str], chat_history: list[Any]\n) -&gt; FunctionCall[str] | None: ...<\/code><\/pre><\/div>\n\n\n\n<p>We also need an LLM function that will use the obtained data and provide an answer for it:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>@chatprompt(\n    SystemMessage(\n        &quot;&quot;&quot;\n        You are an investment research assistant. \n        Only use retrieved data for your analysis.\n        &quot;&quot;&quot;\n    ),\n    UserMessage(\n        &quot;You need to answer this question: {question}\\nAnalyze the following data: {collected_data}&quot;\n    ),\n)\ndef analyze_data(question: str, collected_data: dict[str, Any]) -&gt; str: ...<\/code><\/pre><\/div>\n\n\n\n<p>Now, all we need is the iteration loop where the querying logic goes:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>def format_collected_data(collected_data: dict[str, Any]) -&gt; str:\n    formatted_data = []\n    for function_name, data in collected_data.items():\n        formatted_data.append(f&quot;### {function_name} Data:\\n{data}\\n&quot;)\n    return &quot;\\n&quot;.join(formatted_data)\n\n\nasync def query(question: str, max_iterations: int = 10) -&gt; AsyncGenerator[str, None]:\n    &quot;&quot;&quot;\n    Runs iterative retrieval and streams LLM analysis.\n    
&quot;&quot;&quot;\n    iteration = 0\n    collected_data = {}\n    called_functions = set()\n    chat_history = [\n        SystemMessage(\n            &quot;&quot;&quot;\n            You are an investment research assistant. \n            Retrieve data iteratively and update insights.\n            &quot;&quot;&quot;\n        )\n    ]\n\n    while iteration &lt; max_iterations:\n        iteration += 1\n        yield f&quot;\\n**Iteration {iteration}...**\\n&quot;\n\n        function_call = iterative_search(question, called_functions, chat_history)\n\n        if function_call is None:\n            yield &quot;\\n**LLM is satisfied with the data. Analyzing now...**\\n&quot;\n            break\n\n        function_name = function_call._function.__name__\n\n        if function_name in called_functions:\n            yield f&quot;\\n**Early stop: {function_name} was already called.**\\n&quot;\n            break\n\n        called_functions.add(function_name)\n        function_args = function_call.arguments\n\n        match function_name:\n            case &quot;get_daily_price&quot;:\n                result = await get_daily_price(**function_args)\n            case &quot;get_company_overview&quot;:\n                result = await get_company_overview(**function_args)\n            case &quot;get_sector_performance&quot;:\n                result = await get_sector_performance()\n            case &quot;get_news_sentiment&quot;:\n                result = await get_news_sentiment(**function_args)\n            case &quot;get_earnings_calendar&quot;:\n                result = await get_earnings_calendar(**function_args)\n            case _:\n                yield f&quot;\\nUnknown function requested: {function_name}\\n&quot;\n                continue\n\n        if not result:\n            yield f&quot;\\n**No new data found for {function_name}, stopping iteration.**\\n&quot;\n            break\n\n        collected_data[function_name] = result\n        yield f&quot;\\n**Retrieved 
data from {function_name}** \u2705\\n&quot;\n\n        chat_history.append(UserMessage(f&quot;Retrieved {function_name} data: {result}&quot;))\n        chat_history.append(AssistantMessage(f&quot;Storing data from {function_name}.&quot;))\n\n    formatted_data = format_collected_data(collected_data)\n    final_analysis = analyze_data(question, formatted_data)\n    yield f&quot;\\n**Investment Insight:**\\n{final_analysis}\\n&quot;<\/code><\/pre><\/div>\n\n\n\n<p>And now we wrap this behind a FastAPI endpoint:<\/p>\n\n\n\n<div class=\"hcb_wrap\"><pre class=\"prism line-numbers lang-python\" data-lang=\"Python\"><code>@app.get(&quot;\/investment_research&quot;)\nasync def investment_research(question: str):\n    return StreamingResponse(query(question), media_type=&quot;text\/event-stream&quot;)\n\n\nif __name__ == &quot;__main__&quot;:\n    import uvicorn\n\n    uvicorn.run(app, host=&quot;0.0.0.0&quot;, port=8000)<\/code><\/pre><\/div>\n\n\n\n<p>Once we start the uvicorn server, we can open <code>http:\/\/localhost:8000\/docs<\/code> in a browser and try out the flow from there. You will see all of your available endpoints (in our case just one) and will be able to try them out. 
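If you want to check the non-LLM plumbing without spending API calls, the small helpers can be exercised directly; for instance, `format_collected_data` (restated here so the snippet runs standalone) just flattens the collected dict into headed sections:

```python
from typing import Any


def format_collected_data(collected_data: dict[str, Any]) -> str:
    # Same helper as defined above: one "### <function> Data:" section per entry
    formatted_data = []
    for function_name, data in collected_data.items():
        formatted_data.append(f"### {function_name} Data:\n{data}\n")
    return "\n".join(formatted_data)


# A hypothetical payload shaped like the get_daily_price result
sample = {"get_daily_price": {"2025-01-27": {"4. close": "229.8600"}}}
print(format_collected_data(sample))
```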
This also serves as a nice user interface.<\/p>\n\n\n\n<p>Let&#8217;s ask it to: &#8220;Tell me about the latest news and price trends of AAPL&#8221;<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"604\" src=\"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040-1024x604.png\" alt=\"\" class=\"wp-image-22808\" srcset=\"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040-1024x604.png 1024w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040-300x177.png 300w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040-768x453.png 768w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040-1536x905.png 1536w, https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/Screenshot-2025-01-31-152040.png 1581w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>We can see that the LLM grabbed both the price data and the news sentiment data, which is exactly what we wanted it to do. Here is the answer it produced:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>**Iteration 1...**\n\n**Retrieved data from get_daily_price** \u2705\n\n**Iteration 2...**\n\n**Retrieved data from get_news_sentiment** \u2705\n\n**Iteration 3...**\n\n**LLM is satisfied with the data. Analyzing now...**\n\n**Investment Insight:**\nUpon analyzing the given daily price and news sentiment data, here are the key trends and highlights:\n\nPrice Trends:\n- The closing price for Apple Inc. (AAPL) has significantly fluctuated over the past several months. 
\n- The highest closing price in our data set was on December 31, 2024, at $250.42, while the lowest closing price was on October 7, 2024, at $221.69.\n- There was a noticeable decline in the stock's price throughout January 2025 which suggests a bearish trend during this period. \n\nLatest News:\n1. Apple Inc. shares rose by 4.02% in pre-market following better-than-expected Q1 revenue and earnings per share. This news has a 'Somewhat-Bullish' sentiment with a score of 0.345283.\n2. Goldman Sachs increased Apple Inc.'s price target from $280 to $294, which could indicate a possible uptrend for the stock in the future. The news sentiment is 'Somewhat-Bullish' with a score of 0.232085.\n3. Samsung's Q4 revenue rose by 12% to $52.2B, releasing their plans for AI-driven premium product growth in 2025. This news doesn't have a direct impact on Apple but could be significant given the competition in the technology industry. The news sentiment is 'Neutral' with a score of 0.110925.\n4. Despite a decline in iPhone and China sales, Apple Inc.'s Q1 revenues and earnings per share were better than expected. The news sentiment is 'Somewhat-Bullish' with a score of 0.345283.\n5. Q1 fiscal 2025 results of AAPL benefited from strong services growth, despite a decline in iPhone sales. The news sentiment is 'Neutral' with a score of 0.070957.\n\nIn summary, while the recent closing price data shows some volatility with a downward trend in January 2025, the news sentiment for Apple Inc. is generally positive, indicating optimism about the company's financials and growth prospects. 
However, investors should constantly keep an eye on the news and financial updates to be aware of the potential risks and changes affecting the company's performance.<\/code><\/pre>\n\n\n\n<p>And that&#8217;s how you can build a simple RAG pipeline with Magentic.<\/p>\n\n\n\n<a name=\"magentic-full-code\">\n\n\n\n<h2 class=\"wp-block-heading\">Full code<\/h2>\n\n\n\n<p><a href=\"https:\/\/github.com\/AlgoTrading101\/Magentic-AlgoTrading101\">GitHub Link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Table of contents: What is Magentic? Magentic is an open-source framework that allows for the seamless integration of Large Language Models (LLMs) into Python code. Website: Magentic GitHub Repository: jackmpcollins\/magentic: Seamlessly integrate LLMs as Python functions Why should I use Magentic? Why shouldn&#8217;t I use Magentic? What are some Magentic alternatives? Magentic alternatives are the [&hellip;]<\/p>\n","protected":false},"author":14,"featured_media":22805,"comment_status":"closed","ping_status":"closed","sticky":true,"template":"","format":"standard","meta":{"_lmt_disableupdate":"no","_lmt_disable":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0},"categories":[3],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v20.7 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How to build LLM Agents with Magentic - AlgoTrading101 Blog<\/title>\n<meta name=\"description\" content=\"Magentic is an open-source framework that allows for the integration of Large Language Models (LLMs) into Python code.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" 
content=\"article\" \/>\n<meta property=\"og:title\" content=\"How to build LLM Agents with Magentic - AlgoTrading101 Blog\" \/>\n<meta property=\"og:description\" content=\"Magentic is an open-source framework that allows for the integration of Large Language Models (LLMs) into Python code.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/\" \/>\n<meta property=\"og:site_name\" content=\"Quantitative Trading Ideas and Guides - AlgoTrading101 Blog\" \/>\n<meta property=\"article:published_time\" content=\"2025-01-31T14:46:05+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-02-04T21:45:51+00:00\" \/>\n<meta property=\"og:image\" content=\"http:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"900\" \/>\n\t<meta property=\"og:image:height\" content=\"675\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"author\" content=\"Igor Radovanovic\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Igor Radovanovic\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"15 minutes\" \/>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"How to build LLM Agents with Magentic - AlgoTrading101 Blog","description":"Magentic is an open-source framework that allows for the integration of Large Language Models (LLMs) into Python code.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/","og_locale":"en_US","og_type":"article","og_title":"How to build LLM Agents with Magentic - AlgoTrading101 Blog","og_description":"Magentic is an open-source framework that allows for the integration of Large Language Models (LLMs) into Python code.","og_url":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/","og_site_name":"Quantitative Trading Ideas and Guides - AlgoTrading101 Blog","article_published_time":"2025-01-31T14:46:05+00:00","article_modified_time":"2025-02-04T21:45:51+00:00","og_image":[{"width":900,"height":675,"url":"http:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2025\/01\/magentic_article_thumbnail.webp","type":"image\/webp"}],"author":"Igor Radovanovic","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Igor Radovanovic","Est. 
reading time":"15 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/#article","isPartOf":{"@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/"},"author":{"name":"Igor Radovanovic","@id":"https:\/\/algotrading101.com\/learn\/#\/schema\/person\/a7ae60c112a73b7c3fe14ac56726a0ae"},"headline":"How to build LLM Agents with Magentic","datePublished":"2025-01-31T14:46:05+00:00","dateModified":"2025-02-04T21:45:51+00:00","mainEntityOfPage":{"@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/"},"wordCount":1970,"publisher":{"@id":"https:\/\/algotrading101.com\/learn\/#organization"},"articleSection":["Programming"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/","url":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/","name":"How to build LLM Agents with Magentic - AlgoTrading101 Blog","isPartOf":{"@id":"https:\/\/algotrading101.com\/learn\/#website"},"datePublished":"2025-01-31T14:46:05+00:00","dateModified":"2025-02-04T21:45:51+00:00","description":"Magentic is an open-source framework that allows for the integration of Large Language Models (LLMs) into Python code.","breadcrumb":{"@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/algotrading101.com\/learn\/magentic-llm-guide\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/algotrading101.com\/learn\/"},{"@type":"ListItem","position":2,"name":"How to build LLM Agents with Magentic"}]},{"@type":"WebSite","@id":"https:\/\/algotrading101.com\/learn\/#website","url":"https:\/\/algotrading101.com\/learn\/","name":"Quantitative Trading Ideas and Guides - AlgoTrading101 
Blog","description":"Authentic Stories about Algorithmic trading, coding and life.","publisher":{"@id":"https:\/\/algotrading101.com\/learn\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/algotrading101.com\/learn\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/algotrading101.com\/learn\/#organization","name":"AlgoTrading101","url":"https:\/\/algotrading101.com\/learn\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/algotrading101.com\/learn\/#\/schema\/logo\/image\/","url":"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2020\/11\/AlgoTrading101-Lucas-Liew.jpg","contentUrl":"https:\/\/algotrading101.com\/learn\/wp-content\/uploads\/2020\/11\/AlgoTrading101-Lucas-Liew.jpg","width":1200,"height":627,"caption":"AlgoTrading101"},"image":{"@id":"https:\/\/algotrading101.com\/learn\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/algotrading101.com\/learn\/#\/schema\/person\/a7ae60c112a73b7c3fe14ac56726a0ae","name":"Igor Radovanovic","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/algotrading101.com\/learn\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/d46175c509b3ee240a1e2bbe735a4d1e?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/d46175c509b3ee240a1e2bbe735a4d1e?s=96&d=mm&r=g","caption":"Igor Radovanovic"},"sameAs":["https:\/\/igorradovanovic.com","https:\/\/www.linkedin.com\/in\/igor-radovanovic-profile"],"url":"https:\/\/algotrading101.com\/learn\/author\/igor\/"}]}},"modified_by":"Lucas 
Liew","_links":{"self":[{"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/posts\/22818"}],"collection":[{"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/users\/14"}],"replies":[{"embeddable":true,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/comments?post=22818"}],"version-history":[{"count":2,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/posts\/22818\/revisions"}],"predecessor-version":[{"id":22834,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/posts\/22818\/revisions\/22834"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/media\/22805"}],"wp:attachment":[{"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/media?parent=22818"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/categories?post=22818"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/algotrading101.com\/learn\/wp-json\/wp\/v2\/tags?post=22818"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}