ARDUINO
VENTUNO Q
A maker's perspective. This music video explores the Arduino ecosystem's strategic response to the shifting hardware landscape. It tracks the journey from the 8-bit legacy of the Arduino Uno R3 through the disruptive arrival of the Raspberry Pi Pico, culminating in the release of the VentUNO Q, the high-performance answer to the NVIDIA Jetson Orin Nano.

Explore the Innovation Stream
Dive into real-world projects and cutting-edge insights from the VideotronicMaker Labs. Discover how individuals are building, coding, and mastering future tech, one science experiment at a time.
Have you forged a path of ingenuity with a unique tech project? We'd love to hear your story and potentially feature your work in the VideotronicMaker Labs!
Read the latest feature!

David Antwi's Lab
David Antwi is shipping the future—imperfectly. Discover the inspiring origin story of a self-taught builder who went from hacking electronics at home to designing autonomous rovers and soft robots at Yale University.
VTM OS: THE DIRECT PATH
Utilizing the Uno Q as a high-performance Linux SBC. Real-time build logs for the VTM OS and Desk Bot.
Tutorials
- Client: VideotronicMaker.com

Tutorial on how to use LM Studio without the Chat UI by running a local server. Deploy an open-source LLM in LM Studio on your PC or Mac without an internet connection. The files in this video include a function that initiates a conversation with the local model, establishes roles, and defines where the instructions come from. The setup allows the script to dynamically read the system message from a text file, making it easy to update the system message, system prompt, or pre-prompt (known in ChatGPT as custom instructions) without changing the script's code.
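The file-based system-message approach described above can be sketched in Python. This is a minimal sketch, assuming LM Studio's local server is running with its OpenAI-compatible endpoint at the default `http://localhost:1234` address; the file name `system_message.txt` and the `chat` helper are illustrative, not the tutorial's actual files.

```python
import json
import urllib.request
from pathlib import Path

# Assumed defaults: LM Studio's local server exposes an OpenAI-compatible
# chat endpoint, commonly at http://localhost:1234/v1. The file name
# "system_message.txt" is a hypothetical example.
SERVER_URL = "http://localhost:1234/v1/chat/completions"
SYSTEM_FILE = Path("system_message.txt")


def build_messages(user_text: str, system_path: Path = SYSTEM_FILE) -> list[dict]:
    """Read the system message from a text file, so it can be edited
    without touching the script, then pair it with the user's turn."""
    system_text = system_path.read_text(encoding="utf-8").strip()
    return [
        {"role": "system", "content": system_text},
        {"role": "user", "content": user_text},
    ]


def chat(user_text: str) -> str:
    """Send one turn to the locally served model and return its reply."""
    payload = json.dumps({
        "model": "local-model",  # LM Studio serves whichever model is loaded
        "messages": build_messages(user_text),
        "temperature": 0.7,
    }).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Introduce yourself in one sentence."))
```

Because the system prompt lives in `system_message.txt` rather than in the source, swapping personas or instructions is a text edit, with no code change and no server restart.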
BEFORE AND AFTER
Watch a protoshield come to life.


Ever wonder what goes into making a protoshield useful? Slide to see the completed build.








