<rss
      xmlns:atom="http://www.w3.org/2005/Atom"
      xmlns:media="http://search.yahoo.com/mrss/"
      xmlns:content="http://purl.org/rss/1.0/modules/content/"
      xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd"
      xmlns:dc="http://purl.org/dc/elements/1.1/"
      version="2.0"
    >
      <channel>
        <title><![CDATA[Mud Pie Analytics]]></title>
        <description><![CDATA[Because everyone loves pie charts! 

We offer analytics solutions, consulting, and training.

Running #Linux on #Framework Hardware with Focus on #FOSS


Banner Photo by Choong Deng Xiang on Unsplash
https://unsplash.com/@dengxiangs]]></description>
        <link>https://mudpieanalytics.npub.pro/tag/metabase/</link>
        <atom:link href="https://mudpieanalytics.npub.pro/tag/metabase/rss/" rel="self" type="application/rss+xml"/>
        <itunes:new-feed-url>https://mudpieanalytics.npub.pro/tag/metabase/rss/</itunes:new-feed-url>
        <itunes:author><![CDATA[mudpieanalytics]]></itunes:author>
        <itunes:subtitle><![CDATA[Because everyone loves pie charts! 

We offer analytics solutions, consulting, and training.

Running #Linux on #Framework Hardware with Focus on #FOSS


Banner Photo by Choong Deng Xiang on Unsplash
https://unsplash.com/@dengxiangs]]></itunes:subtitle>
        <itunes:type>episodic</itunes:type>
        <itunes:owner>
          <itunes:name><![CDATA[mudpieanalytics]]></itunes:name>
          <itunes:email><![CDATA[mudpieanalytics]]></itunes:email>
        </itunes:owner>
            
      <pubDate>Sun, 16 Mar 2025 01:15:24 GMT</pubDate>
      <lastBuildDate>Sun, 16 Mar 2025 01:15:24 GMT</lastBuildDate>
      
      <itunes:image href="https://image.nostr.build/504a44049c5d186dea958d25f6467910bb47f44877b0dd7b2f3f5341f733b1b5.jpg" />
      <image>
        <title><![CDATA[Mud Pie Analytics]]></title>
        <link>https://mudpieanalytics.npub.pro/tag/metabase/</link>
        <url>https://image.nostr.build/504a44049c5d186dea958d25f6467910bb47f44877b0dd7b2f3f5341f733b1b5.jpg</url>
      </image>
      <item>
      <title><![CDATA[As pointed out in Databricks' Generative…]]></title>
      <description><![CDATA[As pointed out in Databricks' Generative AI Fundamentals course, which I strongly recommend (it's free and really good), AI “is a necessary condition to compete, but it is not a sufficient condition to differentiate yourself in the market.” AI tools and LLMs are widely available and accessible, including local…]]></description>
             <itunes:subtitle><![CDATA[As pointed out in Databricks' Generative AI Fundamentals course, which I strongly recommend (it's free and really good), AI “is a necessary condition to compete, but it is not a sufficient condition to differentiate yourself in the market.” AI tools and LLMs are widely available and accessible, including local…]]></itunes:subtitle>
      <pubDate>Sun, 16 Mar 2025 01:15:24 GMT</pubDate>
      <link>https://mudpieanalytics.npub.pro/post/note10gcnz3zu3ghgn7dw8lf6svk9fdleplja39yky9e0wyz48sfuf23sll6lw6/</link>
      <comments>https://mudpieanalytics.npub.pro/post/note10gcnz3zu3ghgn7dw8lf6svk9fdleplja39yky9e0wyz48sfuf23sll6lw6/</comments>
      <guid isPermaLink="false">note10gcnz3zu3ghgn7dw8lf6svk9fdleplja39yky9e0wyz48sfuf23sll6lw6</guid>
      <category>llms</category>
      
        <media:content url="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg" medium="image"/>
        <enclosure 
          url="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg" length="0" 
          type="image/jpeg" 
        />
      <noteId>note10gcnz3zu3ghgn7dw8lf6svk9fdleplja39yky9e0wyz48sfuf23sll6lw6</noteId>
      <npub>npub1pk7tnp53zx4x9kwgd59490qykk0tece8k2864c88v3jqcu9marzstxcenu</npub>
      <dc:creator><![CDATA[mudpieanalytics]]></dc:creator>
      <content:encoded><![CDATA[<p>As pointed out in Databricks' Generative AI Fundamentals course, which I strongly recommend (it's free and really good), AI “is a necessary condition to compete, but it is not a sufficient condition to differentiate yourself in the market.” AI tools and LLMs are widely available and accessible, including local LLMs.<br><br>So, how can LLMs be leveraged in a way that is practical and contributes to success?<br><br>From what I have seen and experimented with so far, one winning formula seems to be the following:<br><br>Local LLMs + Proprietary Data = Comparative and Competitive Advantage<br>Getting started with running local LLMs is as easy as navigating to Ollama and running the following commands (Linux example, of course):<br><br>Download Ollama:<br><br>curl -fsSL <np-embed url="https://ollama.com/install.sh"><a href="https://ollama.com/install.sh">https://ollama.com/install.sh</a></np-embed> | sh<br>Run llama3.2:1b (example):<br><br>ollama run llama3.2:1b<br>Check the list of downloaded models:<br><br>ollama list<br>Verify Ollama is running locally (that's right, you don't need an internet connection). <br><br><np-embed url="http://localhost:11434/"><a href="http://localhost:11434/">http://localhost:11434/</a></np-embed><br><br><br>Here are some advantages of running local LLMs as opposed to relying on proprietary models:<br><br>Free<br>No Contracts<br>Control<br>Flexibility<br>Choices<br>Security<br>Anonymity<br><br>However, appropriate hardware and know-how are needed for successful execution.<br><br>As we are told, there are no perfect models; only trade-offs. So, how do you choose and evaluate options across families of models such as llama, hermes, and mistral, and select the appropriate size within a model (1.5b, 3b, 70b)? 
I call this horizontal vs vertical comparison, respectively.<br><br>Obviously, use cases and hardware requirements need to be considered, but beyond that, in addition to running prompts ad hoc in the terminal or using Open WebUI, a systematic way to compare models is needed for analytics purposes, especially if we want to follow the winning formula and take full advantage of our data. Here is a high-level overview:<br><a href="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg"></a><br><br>Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) and hardware constant, and systematically capturing outputs has proven incredibly empowering, especially when the workflow is already integrated with proprietary data sources. Needless to say, data wrangling is often needed prior to running analytics, as well as LLMs, and Knime workflows are great at that.<br><br>In this workflow, as a POC, I compare llama3.2:1b vs llama3.2:3b vs deepseek-r1:1.5b for sentiment analysis and topic modeling. The prompt, as well as the responses of each of the three models, is captured and can then be easily compared and analyzed: <br><a href="https://image.nostr.build/3d4c8b18957a5c2ffce1454c16d86092891e6486d12305a37048c5635ae97ee6.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/3d4c8b18957a5c2ffce1454c16d86092891e6486d12305a37048c5635ae97ee6.jpg"></a><br><br>When new LLMs are released, the workflow provides a flexible way to incorporate and test them. Such a framework provides a systematic pipeline to experiment, evaluate, productionize, and monitor models' outputs. 
Great for POCs!<br><br>Once the appropriate model is selected, the workflow can run local LLMs for analytics use cases such as sentiment analysis, topic modeling, etc. And visualizing the results in Metabase (coming soon as another FOSS Analytics segment) for a seamless FOSS Analytics experience is priceless (literally).<br><br>It is all about utilizing tools and building workflows in an integrated and systematic manner. <br><br>This end-to-end FOSS Analytics process flow is running on Linux on Framework hardware - the perfect combination for optimal performance and flexibility.<br><br>Join me in democratizing access to analytics capabilities and empowering people and organizations with data-driven insights! <br><br><a href='/tag/llms/'>#LLMs</a> <a href='/tag/knime/'>#Knime</a> <a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/knime/'>#knime</a> <a href='/tag/metabase/'>#metabase</a>  <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a><br><br></p>
]]></content:encoded>
      <itunes:author><![CDATA[mudpieanalytics]]></itunes:author>
      <itunes:summary><![CDATA[<p>As pointed out in Databricks' Generative AI Fundamentals course, which I strongly recommend (it's free and really good), AI “is a necessary condition to compete, but it is not a sufficient condition to differentiate yourself in the market.” AI tools and LLMs are widely available and accessible, including local LLMs.<br><br>So, how can LLMs be leveraged in a way that is practical and contributes to success?<br><br>From what I have seen and experimented with so far, one winning formula seems to be the following:<br><br>Local LLMs + Proprietary Data = Comparative and Competitive Advantage<br>Getting started with running local LLMs is as easy as navigating to Ollama and running the following commands (Linux example, of course):<br><br>Download Ollama:<br><br>curl -fsSL <np-embed url="https://ollama.com/install.sh"><a href="https://ollama.com/install.sh">https://ollama.com/install.sh</a></np-embed> | sh<br>Run llama3.2:1b (example):<br><br>ollama run llama3.2:1b<br>Check the list of downloaded models:<br><br>ollama list<br>Verify Ollama is running locally (that's right, you don't need an internet connection). <br><br><np-embed url="http://localhost:11434/"><a href="http://localhost:11434/">http://localhost:11434/</a></np-embed><br><br><br>Here are some advantages of running local LLMs as opposed to relying on proprietary models:<br><br>Free<br>No Contracts<br>Control<br>Flexibility<br>Choices<br>Security<br>Anonymity<br><br>However, appropriate hardware and know-how are needed for successful execution.<br><br>As we are told, there are no perfect models; only trade-offs. So, how do you choose and evaluate options across families of models such as llama, hermes, and mistral, and select the appropriate size within a model (1.5b, 3b, 70b)? 
I call this horizontal vs vertical comparison, respectively.<br><br>Obviously, use cases and hardware requirements need to be considered, but beyond that, in addition to running prompts ad hoc in the terminal or using Open WebUI, a systematic way to compare models is needed for analytics purposes, especially if we want to follow the winning formula and take full advantage of our data. Here is a high-level overview:<br><a href="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg"></a><br><br>Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) and hardware constant, and systematically capturing outputs has proven incredibly empowering, especially when the workflow is already integrated with proprietary data sources. Needless to say, data wrangling is often needed prior to running analytics, as well as LLMs, and Knime workflows are great at that.<br><br>In this workflow, as a POC, I compare llama3.2:1b vs llama3.2:3b vs deepseek-r1:1.5b for sentiment analysis and topic modeling. The prompt, as well as the responses of each of the three models, is captured and can then be easily compared and analyzed: <br><a href="https://image.nostr.build/3d4c8b18957a5c2ffce1454c16d86092891e6486d12305a37048c5635ae97ee6.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/3d4c8b18957a5c2ffce1454c16d86092891e6486d12305a37048c5635ae97ee6.jpg"></a><br><br>When new LLMs are released, the workflow provides a flexible way to incorporate and test them. Such a framework provides a systematic pipeline to experiment, evaluate, productionize, and monitor models' outputs. 
Great for POCs!<br><br>Once the appropriate model is selected, the workflow can run local LLMs for analytics use cases such as sentiment analysis, topic modeling, etc. And visualizing the results in Metabase (coming soon as another FOSS Analytics segment) for a seamless FOSS Analytics experience is priceless (literally).<br><br>It is all about utilizing tools and building workflows in an integrated and systematic manner. <br><br>This end-to-end FOSS Analytics process flow is running on Linux on Framework hardware - the perfect combination for optimal performance and flexibility.<br><br>Join me in democratizing access to analytics capabilities and empowering people and organizations with data-driven insights! <br><br><a href='/tag/llms/'>#LLMs</a> <a href='/tag/knime/'>#Knime</a> <a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/knime/'>#knime</a> <a href='/tag/metabase/'>#metabase</a>  <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a><br><br></p>
]]></itunes:summary>
      <itunes:image href="https://image.nostr.build/77df427dd32ea5328ed2c1d29a775685d697b69259ef8317b085cafad2829d87.jpg"/>
      </item>
      
      <item>
      <title><![CDATA[Running local LLMs in Knime workflows…]]></title>
      <description><![CDATA[Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) constant, and systematically capturing outputs is very empowering.

Use cases include sentiment analysis, outlier detection, etc.

The results are then visualized in Metabase.

End-to-End FOSS Analytics process flow running in Linux on…]]></description>
             <itunes:subtitle><![CDATA[Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) constant, and systematically capturing outputs is very empowering.

Use cases include sentiment analysis, outlier detection, etc.

The results are then visualized in Metabase.

End-to-End FOSS Analytics process flow running in Linux on…]]></itunes:subtitle>
      <pubDate>Sun, 09 Mar 2025 20:08:28 GMT</pubDate>
      <link>https://mudpieanalytics.npub.pro/post/note1p9ccma5jfx2gplfqelaf79ptcfzqteyf5e0xkwzadmy4nhrusvaqljy9ju/</link>
      <comments>https://mudpieanalytics.npub.pro/post/note1p9ccma5jfx2gplfqelaf79ptcfzqteyf5e0xkwzadmy4nhrusvaqljy9ju/</comments>
      <guid isPermaLink="false">note1p9ccma5jfx2gplfqelaf79ptcfzqteyf5e0xkwzadmy4nhrusvaqljy9ju</guid>
      <category>tech</category>
      
        <media:content url="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg" medium="image"/>
        <enclosure 
          url="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg" length="0" 
          type="image/jpeg" 
        />
      <noteId>note1p9ccma5jfx2gplfqelaf79ptcfzqteyf5e0xkwzadmy4nhrusvaqljy9ju</noteId>
      <npub>npub1pk7tnp53zx4x9kwgd59490qykk0tece8k2864c88v3jqcu9marzstxcenu</npub>
      <dc:creator><![CDATA[mudpieanalytics]]></dc:creator>
      <content:encoded><![CDATA[<p>Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) constant, and systematically capturing outputs is very empowering.<br><br>Use cases include sentiment analysis, outlier detection, etc.<br><br>The results are then visualized in Metabase.<br><br>End-to-End FOSS Analytics process flow running in Linux on a Framework 13. <br><a href="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg"></a><br><br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/knime/'>#knime</a> <a href='/tag/metabase/'>#metabase</a>  <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a></p>
]]></content:encoded>
      <itunes:author><![CDATA[mudpieanalytics]]></itunes:author>
      <itunes:summary><![CDATA[<p>Running local LLMs in Knime workflows for model comparison in analytics use cases, holding prompt(s) constant, and systematically capturing outputs is very empowering.<br><br>Use cases include sentiment analysis, outlier detection, etc.<br><br>The results are then visualized in Metabase.<br><br>End-to-End FOSS Analytics process flow running in Linux on a Framework 13. <br><a href="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg"></a><br><br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/knime/'>#knime</a> <a href='/tag/metabase/'>#metabase</a>  <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a></p>
]]></itunes:summary>
      <itunes:image href="https://image.nostr.build/81a7889c04a9588e243f4a61cae7fc927d0318d517eff726fda269a6766ee619.jpg"/>
      </item>
      
      <item>
      <title><![CDATA[We just updated #Metabase to the…]]></title>
      <description><![CDATA[We just updated #Metabase to the latest v52 release, running locally in a #Docker container.

 

#tech #foss #analytics #software #linux #framework #notetaking #productivity #IT #statistics #data #datavisualization #BI #data #database #fossanalytics #metabase #docker…]]></description>
             <itunes:subtitle><![CDATA[We just updated #Metabase to the latest v52 release, running locally in a #Docker container.

 

#tech #foss #analytics #software #linux #framework #notetaking #productivity #IT #statistics #data #datavisualization #BI #data #database #fossanalytics #metabase #docker…]]></itunes:subtitle>
      <pubDate>Fri, 13 Dec 2024 04:42:39 GMT</pubDate>
      <link>https://mudpieanalytics.npub.pro/post/note1wj7kdksxurg8aveffl0rsr8xqk6jx0fxglr0sh5vy3svlr0sm8vs6huyyl/</link>
      <comments>https://mudpieanalytics.npub.pro/post/note1wj7kdksxurg8aveffl0rsr8xqk6jx0fxglr0sh5vy3svlr0sm8vs6huyyl/</comments>
      <guid isPermaLink="false">note1wj7kdksxurg8aveffl0rsr8xqk6jx0fxglr0sh5vy3svlr0sm8vs6huyyl</guid>
      <category>metabase</category>
      
        <media:content url="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg" medium="image"/>
        <enclosure 
          url="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg" length="0" 
          type="image/jpeg" 
        />
      <noteId>note1wj7kdksxurg8aveffl0rsr8xqk6jx0fxglr0sh5vy3svlr0sm8vs6huyyl</noteId>
      <npub>npub1pk7tnp53zx4x9kwgd59490qykk0tece8k2864c88v3jqcu9marzstxcenu</npub>
      <dc:creator><![CDATA[mudpieanalytics]]></dc:creator>
      <content:encoded><![CDATA[<p>We just updated <a href='/tag/metabase/'>#Metabase</a> to the latest v52 release, running locally in a <a href='/tag/docker/'>#Docker</a> container.<br><br> <a href="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg"></a><br><br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a></p>
]]></content:encoded>
      <itunes:author><![CDATA[mudpieanalytics]]></itunes:author>
      <itunes:summary><![CDATA[<p>We just updated <a href='/tag/metabase/'>#Metabase</a> to the latest v52 release, running locally in a <a href='/tag/docker/'>#Docker</a> container.<br><br> <a href="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg" class="vbx-media" target="_blank"><img class="venobox" src="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg"></a><br><br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a> <a href='/tag/data/'>#data</a> <a href='/tag/database/'>#database</a> <a href='/tag/fossanalytics/'>#fossanalytics</a> <a href='/tag/metabase/'>#metabase</a> <a href='/tag/docker/'>#docker</a></p>
]]></itunes:summary>
      <itunes:image href="https://image.nostr.build/1e86f3df4a42eb27c4eb4a57d3aff6377d737f0e67643d192c7e7ef2c16c5b0c.jpg"/>
      </item>
      
      <item>
      <title><![CDATA[FOSS Analytics stack running on Linux/…]]></title>
      <description><![CDATA[FOSS Analytics stack running on Linux/Framework 13.  

Note Taking/Code Versioning: #Joplin, #Notepadnext, and #Notepadqq

Office Suite: #LibreOffice | #OnlyOffice

Data Workflow: #Knime

BI and Dashboards: #Metabase

AI: Local LLMs with Open WebUI
 
#tech #foss #analytics #software #linux #framework #notetaking #productivity #IT #statistics #data #datavisualization #BI…]]></description>
             <itunes:subtitle><![CDATA[FOSS Analytics stack running on Linux/Framework 13.  

Note Taking/Code Versioning: #Joplin, #Notepadnext, and #Notepadqq

Office Suite: #LibreOffice | #OnlyOffice

Data Workflow: #Knime

BI and Dashboards: #Metabase

AI: Local LLMs with Open WebUI
 
#tech #foss #analytics #software #linux #framework #notetaking #productivity #IT #statistics #data #datavisualization #BI…]]></itunes:subtitle>
      <pubDate>Sun, 17 Nov 2024 01:32:51 GMT</pubDate>
      <link>https://mudpieanalytics.npub.pro/post/note1k5hc32872fm4fdweeyawvd33h7rqq60gampm93f9a0sgv4s4pjrslzv0fl/</link>
      <comments>https://mudpieanalytics.npub.pro/post/note1k5hc32872fm4fdweeyawvd33h7rqq60gampm93f9a0sgv4s4pjrslzv0fl/</comments>
      <guid isPermaLink="false">note1k5hc32872fm4fdweeyawvd33h7rqq60gampm93f9a0sgv4s4pjrslzv0fl</guid>
      <category>Joplin</category>
      
        <media:content url="https://m.primal.net/MYVZ.png" medium="image"/>
        <enclosure 
          url="https://m.primal.net/MYVZ.png" length="0" 
          type="image/png" 
        />
      <noteId>note1k5hc32872fm4fdweeyawvd33h7rqq60gampm93f9a0sgv4s4pjrslzv0fl</noteId>
      <npub>npub1pk7tnp53zx4x9kwgd59490qykk0tece8k2864c88v3jqcu9marzstxcenu</npub>
      <dc:creator><![CDATA[mudpieanalytics]]></dc:creator>
      <content:encoded><![CDATA[<p>FOSS Analytics stack running on Linux/Framework 13. <br><br>Note Taking/Code Versioning: <a href='/tag/joplin/'>#Joplin</a>, <a href='/tag/notepadnext/'>#Notepadnext</a>, and <a href='/tag/nodepadqq/'>#Notepadqq</a><br><br>Office Suite: <a href='/tag/libreoffice/'>#LibreOffice</a> | <a href='/tag/onlyoffice/'>#OnlyOffice</a><br><br>Data Workflow: <a href='/tag/knime/'>#Knime</a><br><br>BI and Dashboards: <a href='/tag/metabase/'>#Metabase</a><br><br>AI: Local LLMs with Open WebUI<br><a href="https://m.primal.net/MYVZ.png" class="vbx-media" target="_blank"><img class="venobox" src="https://m.primal.net/MYVZ.png"></a> <br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a><br></p>
]]></content:encoded>
      <itunes:author><![CDATA[mudpieanalytics]]></itunes:author>
      <itunes:summary><![CDATA[<p>FOSS Analytics stack running on Linux/Framework 13. <br><br>Note Taking/Code Versioning: <a href='/tag/joplin/'>#Joplin</a>, <a href='/tag/notepadnext/'>#Notepadnext</a>, and <a href='/tag/nodepadqq/'>#Notepadqq</a><br><br>Office Suite: <a href='/tag/libreoffice/'>#LibreOffice</a> | <a href='/tag/onlyoffice/'>#OnlyOffice</a><br><br>Data Workflow: <a href='/tag/knime/'>#Knime</a><br><br>BI and Dashboards: <a href='/tag/metabase/'>#Metabase</a><br><br>AI: Local LLMs with Open WebUI<br><a href="https://m.primal.net/MYVZ.png" class="vbx-media" target="_blank"><img class="venobox" src="https://m.primal.net/MYVZ.png"></a> <br><a href='/tag/tech/'>#tech</a> <a href='/tag/foss/'>#foss</a> <a href='/tag/analytics/'>#analytics</a> <a href='/tag/software/'>#software</a> <a href='/tag/linux/'>#linux</a> <a href='/tag/framework/'>#framework</a> <a href='/tag/notetaking/'>#notetaking</a> <a href='/tag/productivity/'>#productivity</a> <a href='/tag/it/'>#IT</a> <a href='/tag/statistics/'>#statistics</a> <a href='/tag/data/'>#data</a> <a href='/tag/datavisualization/'>#datavisualization</a> <a href='/tag/bi/'>#BI</a><br></p>
]]></itunes:summary>
      <itunes:image href="https://m.primal.net/MYVZ.png"/>
      </item>
      
      </channel>
      </rss>
    