Dynamo BIM – https://dynamobim.org/

đŸŽč “Auld Lang Syne” Player – Dynamo Forum 2025 Winner
https://dynamobim.org/%f0%9f%8e%b9-auld-lang-syne-player-dynamo-forum-2025-winner/
Tue, 03 Mar 2026 19:34:29 +0000

Dynamo Team Note:

At the close of 2025, the Dynamo community gathered for the “Good Old-Fashioned Holiday Modeling Contest with Dynamo”, inviting members to create festive and imaginative projects using Dynamo. You can read about the original contest here: https://forum.dynamobim.com/t/dynamo-challenge-good-old-fashioned-holiday-modeling-contest-with-dynamo/113324

We are delighted to share this guest blog post featuring the winning entry from community member, Alien. This entry is the “Auld Lang Syne” Player, a playful and technically inventive way to ring in the New Year with Dynamo.


What Does It Do?

Dynamo plays Auld Lang Syne, the traditional Scottish New Year’s song, by iterating through an ordered list of sound files.

As the tune plays, a miniature piano — created entirely with Dynamo geometry — lights up the correct key in real time, colouring each note as it sounds.

It’s part technical experiment, part festive absurdity, and entirely Dynamo.


🎬 Video (sound on!)


Setup

  • Created on a Windows 11 PC

  • Using Dynamo for Revit 2025

  • Graph uses OOTB (Out of the Box) nodes

  • Includes the Dynamo Text package

    • Used only to write “Happy 2026”

    • Does not affect the tune


Why?

Dynamo forum competitions tend to appear quietly — often in the middle of the night for those of us in Europe.

There’s usually a delightfully vague brief and absolutely no sensible reason to attempt the challenge.

But we do it anyway.

Because it’s fun.

Because you might learn something.

Because suddenly Dynamo is being asked to do something it was never designed to do — using tools meant for geometry, data, and order.

The task isn’t about usefulness or optimisation.

It’s about:

  • Taking a vague idea

  • An ordered list

  • A Python node

And seeing whether something technically absurd can be made to happen.

In this case?

Dynamo playing Auld Lang Syne.


Thought Process

As a Brit, New Year and Auld Lang Syne are basically synonymous.

It’s on the TV.
It’s on the radio.
It’s being sung somewhere slightly out of tune every single year.

Just as you can’t really have Christmas without mince pies, you can’t have New Year without Auld Lang Syne turning up — whether you asked for it or not.

I’d already experimented with text-to-voice in Dynamo a couple of years earlier, so this wasn’t a huge leap into the unknown.

The process:

  1. Grab sheet music

  2. Translate notes into text Dynamo could understand

  3. Name the .wav files to match

    • C.wav

    • FS.wav

    • C_high.wav

    • etc.

After that, it was mostly:

Wire it together.
Hit Run.
See if Dynamo would actually play along.


How It Works

đŸŽč Piano Geometry

The piano keys were created using Cuboids in Dynamo.


đŸŽŒ Writing the Tune

The tune itself was written in text.

DateTime.Now was used to step through the ordered list of notes.
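The graph itself isn't reproduced here, but the time-based stepping could be sketched as follows (the note sequence, tempo, and function names are illustrative assumptions, not the author's actual graph):

```python
from datetime import datetime

# Illustrative note sequence, using the same names as the .wav files
# described earlier (C.wav, FS.wav, C_high.wav, ...)
NOTES = ["C", "F", "F", "F", "A", "G", "F", "G"]
SECONDS_PER_NOTE = 0.5  # assumed tempo

def current_note(start_time, now=None):
    """Map elapsed time since the tune started to an index in the note list."""
    now = now or datetime.now()
    elapsed = (now - start_time).total_seconds()
    index = int(elapsed // SECONDS_PER_NOTE)
    if index >= len(NOTES):
        return None  # tune finished
    return NOTES[index]
```

Because Dynamo re-evaluates the graph on each Periodic tick, every run reads DateTime.Now again and so advances to the next note in the ordered list.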


🐍 Python + Windows Sound

The note names were mapped to file locations in a Python node.

Windows’ winsound module was used to locate and play the corresponding .wav file for each note.
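A minimal sketch of that mapping and playback logic, assuming the file-naming convention above (winsound.PlaySound is the real standard-library call; the helper functions are illustrative, not the author's exact node):

```python
import os

try:
    import winsound  # Windows-only standard-library module
except ImportError:
    winsound = None  # not on Windows; playback unavailable

def note_to_path(note, folder):
    """Map a note name (e.g. 'FS', 'C_high') to its .wav file path."""
    return os.path.join(folder, note + ".wav")

def play_note(note, folder):
    path = note_to_path(note, folder)
    if winsound is not None and os.path.exists(path):
        # SND_FILENAME: interpret the first argument as a file path
        winsound.PlaySound(path, winsound.SND_FILENAME)
    return path
```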


Upgrade

Since the original competition entry, the code has been upgraded so you can now play a piano inside Revit.

🔗 Scripts available on GitHub:
https://github.com/MoosiestMoose


Requirements

  • Windows operating system
    (required for winsound)

  • Dynamo Run Mode: Periodic

  • Sound files (.wav) stored locally

    • Each file contains a single piano note

    • Files must follow consistent naming

Example filenames:

C.wav
D.wav
E.wav
FS.wav
C_high.wav
D_high.wav


Step-by-Step Instructions

  1. Download the .wav files and store them locally.

  2. Open Dynamo.

  3. Open the “Py: play selected note” Python node.

  4. Enter the folder path where your .wav files are stored.

  5. Set Dynamo to Periodic.

  6. Pour a whiskey.

  7. Imagine it’s midnight on 31st December.

Masters of Disasters: Dynamo Hackathon 2025 Winner
https://dynamobim.org/masters-of-disasters-dynamo-hackathon-2025-winner/
Wed, 10 Dec 2025 16:44:13 +0000

When disaster strikes, every second counts. At the 2025 Dynamo Hackathon in Nashville, our team tackled one of the most pressing challenges facing emergency response teams today: how to rapidly deploy life-saving infrastructure when communities need it most. The result? An AI-powered disaster response system that transforms how we prepare for and respond to emergencies.

Our team—Enrique Galicia, Aaron Coffman, Adrián Fernández, Daniel Drennen, Colin Molnar, and Pawan Bhat—came together with a shared vision: reimagine disaster response through the power of automation, intelligence, and speed. We set out to build a resilient, adaptive, and intelligent safety net capable of saving lives and minimizing disruption when disaster strikes.

The Challenge: Speed Meets Complexity

Natural disasters don’t wait for perfect planning. Whether it’s flooding, wildfires, or hurricanes, emergency responders face an impossible challenge: They need to quickly deploy temporary infrastructure—shelters, medical facilities, supply distribution centers—in unfamiliar terrain, often with limited information and under extreme time pressure.

Traditional disaster response planning is manual, time-consuming, and relies heavily on human decision-making under stress. Responders must:

  • Analyze unfamiliar terrain and infrastructure
  • Identify safe zones for emergency facilities
  • Calculate optimal routes through potentially damaged areas
  • Deploy resources without complete information
  • Make life-or-death decisions in minutes, not hours

Every minute of delay can mean the difference between life and death. The AEC industry has powerful tools for site analysis, pathfinding, and optimization, but these capabilities have never been applied to real-time disaster response at scale.

Our vision: Automated disaster response architecture

Masters of Disasters is a proposed intelligent system that would automate emergency infrastructure deployment from disaster detection to facility placement. By combining real-time disaster monitoring, AI-powered site analysis, and advanced pathfinding algorithms, we envision a workflow that could analyze a disaster site and deploy emergency infrastructure in minutes instead of hours.

The conceptual system works through four integrated stages:

  1. Real-Time Disaster Detection and Classification: We integrated disaster monitoring systems with FEMA, NOAA, and USGS databases to detect and classify disasters in real time. When an event occurs—whether flooding, earthquake, or other emergency—the system automatically receives location coordinates, disaster type, and severity data.

  2. Automated Site Analysis with Forma: Using Autodesk Forma, the system pulls comprehensive real-world site data for the affected area: topography, existing infrastructure, road networks, vegetation, and accessibility. This gives responders an instant, data-rich picture of the disaster zone without manual site surveys.

  3. AI-Powered Site Suitability Analysis: The heart of our system is AI-driven analysis that processes topographic data to identify optimal locations for emergency facilities. The algorithm considers:
    • Elevation and flood risk
    • Proximity to affected populations
    • Safety and accessibility
    • Ground conditions for rapid construction
    • Distance from hazard zones
  4. Optimal Pathfinding with VASA: Once facility locations are determined, the system uses VASA (Visual Area Space Analysis) within Dynamo to calculate the shortest, safest routes for emergency responders. VASA processes topography curves and meshes to find paths that account for:
    • Damaged or blocked roads
    • Debris fields
    • Topographic barriers
    • Multiple access points
    • Traffic flow considerations
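VASA's internals aren't reproduced here, but the kind of obstacle-avoiding shortest-path search described above can be illustrated with a small breadth-first search over a grid, where blocked cells stand in for debris fields or damaged roads (the grid encoding is an assumption for illustration only):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on a grid: 0 = passable, 1 = blocked (debris, damage)."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path  # BFS guarantees this is a shortest route
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no accessible route to the goal
```

The real system works on topography meshes rather than grids, but the principle is the same: route around cells marked as hazardous and return the shortest remaining path, or report that no access exists.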

Progression using VASA to identify two points with roads, then match possibilities and use closest distance to move between them.

Workflow: Translate mesh element cities in Forma to mesh analysis in VASA

The Technical Foundation: Agentic Tools and Integration

During the hackathon, we focused on proving the conceptual framework and identifying the technical requirements for implementation.

What We Built:

  • Conceptual architecture for Model Context Protocol (MCP)-Dynamo integration
  • Visual demonstrations of multiple disaster scenarios (flooding, hurricanes, earthquakes, biological threats)
  • Identification of required agentic tool capabilities
  • Mockups of user interfaces and workflow diagrams
  • Proof-of-concept for data visualization and conflict detection logic

What We Validated:

  • The workflow is technically feasible with existing tools (Forma, Dynamo, VASA, MCP)
  • Multiple disaster types can use the same core architecture
  • Emergency management professionals expressed strong interest
  • The approach scales beyond just disaster response to other rapid-deployment scenarios

The Agentic Tools Framework

Our proposed system would leverage several cutting-edge Dynamo capabilities:

Custom Agentic Nodes:

  • Toposolid extraction nodes that automatically slice terrain data at specified hazard levels (flood elevations, earthquake damage zones, etc.)
  • Geometric intersection nodes that identify conflict points between infrastructure and hazard zones
  • Path analysis nodes that call VASA algorithms for optimal routing
  • Geometry allocation nodes that place shelter and facility geometry at identified safe locations

Real-World Data Integration:

The system seamlessly pulls data from multiple sources:

  • Live disaster feeds from government agencies
  • Site topography and infrastructure from Forma
  • Road networks and accessibility data
  • Building footprints and facility locations

These components exist independently—our innovation is in connecting them through an intelligent MCP architecture that orchestrates the entire workflow.

The Architecture Decision: MCP Integration Strategy

One of the most critical insights from the hackathon was recognizing how Model Context Protocol (MCP) could revolutionize disaster response automation. Since both approaches were prototyped during the hackathon, we have two distinct architectural options for production implementation:

Option 1: Dynamo Nodes Generated by MCP

In this approach, MCP servers would dynamically generate custom Dynamo nodes for each disaster routine:

MCP Server → Generates Custom Nodes → Dynamo Executes Locally

An MCP (Model Context Protocol) setup requires a server that exposes the functions and a client that invokes them. The server side can be provided through the Alpha version of the Dynamo Autodesk Assistant by defining a routine of elements to be triggered.

Hurricane query workflow in Autodesk Assistant Alpha

Autodesk Assistant can create nodes, Python scripts, or a mix of both. It acts as an MCP client running against the Dynamo MCP server, which exposes Dynamo functions and leverages Autodesk AI.

Or you can use half-baked functions using the Agent Node:

The main component for this workflow is the AgentProcess.GetAllAvailableTools node, which retrieves MCPs loaded in Dynamo, such as the Revit MCP, to execute tasks.

You can also retrieve additional data from other MCPs. We tested the Excel MCP to access data from a spreadsheet. You can use the Send Request (variable inputs) node to enable functionality from other applications and bring it into Dynamo. This node uses Autodesk AI models and allows you to supply context, request, and other inputs.

Advantages:

  • Dynamo remains the primary execution environment
  • Leverage existing Dynamo Player deployment infrastructure
  • Engineers can visualize and debug the graph structure
  • Works within established IT security perimeters
  • No external API dependencies during execution

Challenges:

  • Node generation adds complexity
  • Updates require re-generating and re-deploying nodes
  • Limited flexibility once nodes are compiled

Option 2: MCP Server Executing Routines Directly

Alternatively, the disaster response routines live within MCP servers, called by lightweight Dynamo wrapper nodes:

Dynamo Thin Client → MCP Server API → Executes Complete Routines → Returns Results

A secondary alternative to this is to create an MCP with the functions using the Dynamo MCP, Revit API, or the Dynamo DLL.

Similar to the use of MCP on Claude, running them in the Dynamo Agent Alpha version enables workflows to be used as tools with Autodesk AI LLM services.

Advantages:

  • Centralized routine management and updates
  • Real-time algorithm improvements without client updates
  • Easier integration with live disaster data feeds
  • Scales across multiple Forma/Revit instances

Challenges:

  • Requires reliable network connectivity
  • API rate limits and latency considerations
  • More complex authentication and security architecture

Our Recommended Hybrid Approach

After extensive discussion, we’re leaning toward a hybrid architecture that leverages the strengths of both:

  1. Core Analysis Engine: MCP servers host the intelligence—AI models, disaster databases, optimization algorithms, real-time data integrations
  2. Dynamo Execution Layer: Lightweight nodes in Dynamo handle:
    • Geometry extraction from Revit/Forma
    • Visualization and user interaction
    • Local geometry operations
    • Final model updates
  3. Smart Caching: Results cache locally so repeated analyses don’t require constant API calls
  4. Graceful Degradation: If MCP servers are unreachable, fall back to last-known-good algorithms running locally

This architecture enables:

  • Rapid iteration: Update algorithms on MCP servers without touching client deployments
  • Offline capability: Core functions work even without connectivity
  • Natural language control: Emergency coordinators can describe scenarios in plain English
  • Multi-platform: The same MCP servers can serve Dynamo, Forma, custom web interfaces, or mobile apps
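The smart-caching and graceful-degradation behavior described above could be sketched like this (the class and function names are illustrative, not taken from the hackathon code):

```python
class ResilientAnalysisClient:
    """Call a remote analysis service, cache results locally, and fall back
    to the last-known-good result if the server is unreachable."""

    def __init__(self, remote_call):
        self._remote_call = remote_call  # e.g. a request to an MCP server
        self._cache = {}

    def analyze(self, scenario_key, params):
        try:
            result = self._remote_call(scenario_key, params)
            self._cache[scenario_key] = result  # refresh last-known-good
            return result
        except ConnectionError:
            # Graceful degradation: reuse the cached result if one exists
            if scenario_key in self._cache:
                return self._cache[scenario_key]
            raise
```

The design choice here is deliberate: the client never blocks an emergency workflow on connectivity; it only raises when there is neither a live server nor a cached answer to offer.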

Why This Matters for Disaster Response

The MCP architecture isn’t just technically elegant—it’s operationally critical for emergency response:

Speed of Deployment: When a disaster strikes, there’s no time to update software. MCP servers can push updated algorithms instantly to all connected clients.

Learning from Each Event: Every disaster response generates data. MCP servers can incorporate lessons learned and improve recommendations in real-time.

Coordination Across Jurisdictions: Multiple agencies using different tools can all connect to the same MCP infrastructure, ensuring consistent analysis and coordinated response.

Future-Proofing: As new AI models emerge or disaster science evolves, the core intelligence upgrades without touching the hundreds or thousands of deployed Dynamo graphs.

This architectural approach transforms Masters of Disasters from a single-purpose tool into a platform for continuous improvement in emergency response capabilities.

Demonstration Scenarios: Visualizing the Possible

During the hackathon, we developed a series of conceptual demonstrations showing how the system would respond to different disaster types. While we didn’t complete a fully functional end-to-end system in 6 hours, we proved the core concepts and visualized the workflow for multiple scenarios:

Flooding Scenario

When rising water threatens a community, the system would:

  1. Receive flood coordinates and predicted water levels
  2. Extract site data showing terrain elevation
  3. Identify conflict points where buildings intersect with flood zones (shown in blue)
  4. Calculate safe zones at higher elevations (shown in green)
  5. Deploy emergency shelters on high ground with optimal accessibility
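In the actual graph this conflict test is done with geometric intersections between the toposolid at flood level and the building geometry; a simplified, purely illustrative elevation-based version of the same classification might look like this (the building names and elevations are made up):

```python
def classify_sites(buildings, flood_level):
    """Split sites into conflicts (at or below flood level) and safe zones.

    buildings: list of (name, base_elevation) pairs
    flood_level: predicted water elevation in the same units
    """
    conflicts, safe = [], []
    for name, base_elevation in buildings:
        if base_elevation <= flood_level:
            conflicts.append(name)  # shown in blue in the demo
        else:
            safe.append(name)       # candidate shelter sites, shown in green
    return conflicts, safe
```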

Hurricane Scenario

High winds and storm surge require different shelter strategies:

  • Identifies structurally compromised buildings (shown in red)
  • Prioritizes wind-protected evacuation routes
  • Places reinforced shelters away from coastal exposure

Earthquake Scenario

Structural damage assessment drives facility placement:

  • Maps damaged infrastructure requiring evacuation (red zones)
  • Identifies open spaces safe from falling debris
  • Creates medical triage facility locations

Biological Contamination

Pandemic or chemical spills require isolation protocols:

  • Visualizes contamination spread patterns
  • Creates quarantine zones and containment perimeters
  • Places isolation facilities with controlled access points

And Yes… Zombie Apocalypse

Because every good hackathon needs a creative scenario:

  • Identifies safe zones away from population centers
  • Creates defensible shelter clusters
  • Optimizes supply distribution without exposure risk

Each scenario uses the same core agentic tools:

  • Toposolid extraction against hazard levels
  • Geometric intersection to detect conflicts
  • Path analysis for optimal routing
  • Geometry allocation for shelter placement

The variety of scenarios proved our approach was adaptable—the same fundamental workflow could address vastly different disaster types by adjusting the hazard parameters and placement constraints.

What We Achieved: Proof of Vision

In just 6 intense hours, we didn’t build a finished, deployable system—but we proved something arguably more important: that this vision is achievable and worth pursuing. The compressed timeframe forced us to focus on what truly mattered: validating the concept, identifying the architecture, and demonstrating the potential. Our proof of concept demonstrated:

Conceptual Validation: We mapped the complete workflow from disaster detection through facility deployment, identifying every data integration point, analysis step, and automation opportunity. In 6 hours, we did the strategic thinking that typically takes weeks.

Visual Demonstration: Through multiple disaster scenarios, we showed how the same core logic adapts to different emergency types—flooding, hurricanes, earthquakes, and more. Each scenario proved the adaptability of our approach.

Agentic Tool Architecture: We defined the specific agentic capabilities needed:

  • Toposolid extraction routines
  • Geometric conflict detection
  • Path analysis integration with VASA
  • Automated geometry placement algorithms

Stakeholder Enthusiasm: The response from judges, emergency management professionals, and fellow developers validated the real-world need and commercial viability.

Clear Implementation Path: Most importantly, we identified exactly what needs to be built and how to architect it for production deployment.

For organizations like FEMA and disaster relief agencies, our hackathon presentation demonstrated a compelling future: when disasters strike, AI-powered systems could have deployment plans ready in minutes instead of hours. The technology exists—now it’s about integration and implementation.

The Development Journey: Challenges and Breakthroughs

Like any hackathon project, Masters of Disasters came with significant technical challenges. Working with Forma’s site data API required careful data parsing and transformation. Getting VASA pathfinding to work with real topographic meshes—not just simplified test geometries—required multiple iterations and creative problem-solving.

One of our biggest breakthroughs came when we successfully integrated the conflict detection system. Initially, we struggled to reliably identify which facilities would be impacted by flooding. The solution came from using geometric solid intersections between the toposolid at the flood level and the building geometry. Once we had reliable conflict detection, the automatic relocation logic fell into place.

The AI-powered site suitability analysis also evolved significantly during development. Early iterations produced recommendations that looked good on paper but didn’t account for real-world constraints like soil conditions or existing utility infrastructure. We refined our prompts and added additional data layers to ensure the AI’s recommendations were genuinely deployable.

Future Steps to Follow

While our hackathon prototype focused on general scenarios, the potential applications are far broader. Future iterations of Masters of Disasters will:

Expand Disaster Type Coverage

  • Earthquake response with structural damage assessment
  • Wildfire evacuation planning with smoke and wind modeling
  • Hurricane preparedness with storm surge prediction
  • Pandemic response with facility isolation requirements

Enhance AI Capabilities

  • Predictive modeling to pre-position resources before disasters strike
  • Machine learning from past disaster responses to improve recommendations
  • Natural language interfaces for emergency coordinators
  • Real-time updates as disaster conditions evolve

Integrate with Emergency Systems

  • Direct API connections to FEMA response systems
  • Mobile apps for field coordinators
  • GIS integration for mapping and visualization
  • IoT sensors for real-time disaster monitoring

Deploy Automated Resource Allocation

  • Supply chain optimization for emergency materials
  • Personnel deployment and shift scheduling
  • Equipment tracking and utilization
  • Cost estimation and budget management

Leverage Dynamo as a Service (DaaS) Integration

  • Deploy the system on Autodesk Forma through DaaS, allowing emergency management agencies to access disaster response planning tools without specialized software installations.

Key Takeaways and Lessons Learned

This hackathon reinforced several critical lessons about building impactful technology:

  1. Focus on Real Impact: We didn’t chase flashy features. Every component of our system exists to save lives. That clarity of purpose kept us focused when decisions got tough.
  2. Leverage Existing Infrastructure: Rather than building everything from scratch, we integrated with proven platforms like Forma and VASA. This let us focus on the novel aspects of disaster response automation.
  3. Start Simple, Iterate Fast: Our first prototype was rough. But by getting something working quickly, we could test, learn, and improve.
  4. Diverse Teams Win: Our team’s varied backgrounds—architecture, engineering, systems design, program management—meant we approached problems from multiple angles. This diversity was our strength.
  5. The Future is Agentic: Dynamo’s agentic tools opened possibilities we couldn’t have achieved with traditional scripting. AI isn’t just about answering questions—it’s about autonomous systems that can analyze, decide, and act.
  6. Technology Can Save Lives: This wasn’t an academic exercise. The tools we built during this hackathon could genuinely help communities prepare for and respond to disasters. That responsibility motivated us every step of the way.

The Bigger Picture: AI in Emergency Response

Masters of Disasters represents something larger than a single hackathon project. It’s a glimpse into how AI and automation can transform emergency response across the board. As climate change increases the frequency and severity of natural disasters, we need tools that can scale to match the challenge.

The AEC industry has spent decades developing sophisticated tools for designing buildings and infrastructure. Now, we’re learning to apply those same capabilities to emergency response. The marriage of BIM, AI, and real-time data creates possibilities that didn’t exist even a few years ago.

For firms, agencies, and organizations involved in disaster response, the message is clear: automation isn’t coming someday—it’s here now. The question isn’t whether AI will transform emergency management, but how quickly we can deploy these capabilities to the people and places that need them most.

Conclusion: A Vision Worth Building

Hackathons are about possibilities. In just 6 hours—a single intense work session—we didn’t build a complete disaster response system, but we proved it’s possible, visualized how it would work, and identified the architecture to make it real.

What sets Masters of Disasters apart isn’t just the application to emergency response. It’s the recognition that Model Context Protocol represents a fundamental shift in how we build intelligent AEC applications. By separating the intelligence layer (MCP servers) from the execution layer (Dynamo, Forma, mobile apps), we create systems that can:

  • Learn continuously from every disaster
  • Update instantly without software deployments
  • Coordinate seamlessly across jurisdictions and platforms
  • Scale infinitely as demand requires

The challenge ahead is execution. We need partnerships with emergency management agencies, integration with government data systems, and validation from disaster response professionals. We need to build the MCP infrastructure, develop the algorithms, and prove the system works under real disaster conditions.

But we’re confident because the need is undeniable and the technology exists. Climate change is increasing disaster frequency and severity. Traditional manual planning can’t scale to meet the challenge. AI-powered automation isn’t a luxury—it’s becoming essential.

To the Dynamo and AEC community: This is what hackathons are for. Not just to build features, but to imagine new possibilities. Masters of Disasters started as a 6-hour sprint. With the right support and partnerships, it could save lives.

The next disaster is coming. Let’s make sure we’re ready.

Note from the Dynamo team: Would you like to try out MCP capabilities in Dynamo? Sign up for the Alpha today! More details in this Dynamo forum post.

Meet the Hackathon Team

Our diverse team brought together expertise from across the AEC and infrastructure industries:

The Masters of Disasters team thanks Autodesk, the Dynamo community, and the hackathon organizers for creating the space to imagine bold solutions. We acknowledge the emergency responders worldwide whose dedication inspired this vision, and we’re committed to turning this concept into reality.

Dynamo PythonNet3 Upgrade: A Practical Guide to Migrating Your Dynamo Graphs
https://dynamobim.org/dynamo-pythonnet3-upgrade-a-practical-guide-to-migrating-your-dynamo-graphs/
Thu, 04 Dec 2025 22:17:35 +0000

With the release of Dynamo 4.0, the Dynamo team shared that Dynamo_PythonNet3 is the new Python engine for Dynamo, based on CPython 3.11 and the Python.NET v3 library. This change is a significant moment in the ongoing evolution of Python scripting in Dynamo. The introduction of PythonNet3 marks a milestone compared to the previous CPython3 implementation (based on PythonNet 2.5). While the latter could be frustrating due to its lack of flexibility within the .NET ecosystem, PythonNet3 effectively bridges these gaps.

This new engine succeeds in the difficult task of reconciling two worlds: It offers interoperability comparable to IronPython, while finally unlocking access to the modern Python ecosystem and its powerful C-based libraries (NumPy, Pandas, etc.).

For developers still relying on IronPython 2.7, it is time to migrate. Beyond the technical advantages of PythonNet3, continuing to use the old engine poses real risks:

  • Technical obsolescence: Python 2.7 has reached its end of life and is no longer maintained.
  • Security: The presence of known vulnerabilities (e.g., urllib2) that will never be patched.
  • Limitations: A total inability to use modern PyPI packages.
  • Future uncertainty: Increasingly uncertain compatibility with future versions of .NET (9+).
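As a concrete illustration of what such a migration touches, here are a few Python 2 idioms commonly found in old IronPython 2.7 nodes and their Python 3 equivalents (the urllib2 case is the vulnerable module flagged above; the helper function is illustrative):

```python
# Python 2 (IronPython 2.7):
#   import urllib2
#   response = urllib2.urlopen(url)
#
# Python 3 (PythonNet3) equivalent:
from urllib.request import urlopen  # replaces urllib2.urlopen

def fetch_status(url):
    with urlopen(url) as response:  # context manager closes the connection
        return response.status

# Other frequent changes encountered when converting scripts:
half = 7 // 2                    # integer division is now explicit; '/' returns a float
text = b"data".decode("utf-8")   # bytes and str are distinct types in Python 3
```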

Over the past few months, part of my time has been spent migrating Python nodes from IronPython2 to PythonNet3. However, as of 2024, the PythonNet3 package was not yet finalized, so I used IronPython3 as an interim step. More than a hundred scripts have thus been converted.

This blog post serves as a technical guide for Dynamo users who use Python in Dynamo, detailing the major changes, the benefits of adopting PythonNet3, and solutions to common migration obstacles.

In addition to this article, I recommend you read this previous blog post by Trygve Wastvedt.


My migration was therefore carried out as follows:

IronPython2 -> IronPython3 -> PythonNet3

Migration process

Recap of the versions available to date

| Features | IronPython 2.7 | IronPython 3 | Python.NET 2.5.x (alias CPython3) | Python.NET 3.x |
| --- | --- | --- | --- | --- |
| Principal Concept | Implementing Python on .NET (with DLR) | Implementing Python on .NET (with DLR) | Gateway to CPython | Gateway to CPython |
| Python Version | Python 2.7 | Python 3.4 (currently) | Python 2.7, Python 3.5–3.9 | Python 3.7+ (modern) |
| .NET Support | .NET Framework & .NET Core / .NET 6/7/8; uncertain compatibility with future versions of .NET | .NET Framework & .NET Core / .NET 5+ | .NET Framework (mainly) | .NET Framework & .NET Core / .NET 5+ |
| Library Compatibility | ⚠ Weak. Incompatible with Python libraries that use C extensions. | ⚠ Weak. Same fundamental limitation as v2.7. | ✅ Excellent. Access to the entire CPython ecosystem. | ✅ Excellent. Access to the entire modern PyPI ecosystem (NumPy, Pandas, etc.). |
| Performance | Good for pure .NET interoperability. | Good for pure .NET interoperability. | Depends on CPython. Slight overhead for calls. | Depends on CPython. Slight overhead for calls, but highly optimized. |
| Status of Python Project | ❌ Obsolete. No longer actively maintained. | ✅ Active. The modern and recommended version if you choose IronPython. | ❌ Obsolete. Replaced by version 3.x. | ✅ Active. The industry standard for Python/.NET interoperability. |

Observations on the previous CPython3 engine

The previous CPython3 engine was based on PythonNet 2.5 and had several limitations and drawbacks, including:

  1. Expensive conversion of .NET collections: The engine performed an automatic and implicit conversion of .NET collections to native Python lists. This approach was costly in performance because it deep-copied the data. Additionally, changes to the Python list were not reflected in the original .NET object.
  2. Difficulty implementing .NET class interfaces: In CPython3 (PythonNet 2.5), it was impossible to easily implement .NET class interfaces.
  3. Problems with method overloading.
  4. Python's binary and unary arithmetic operators were not mapped to the corresponding C# operator methods.
  5. Iteration bug: A bug caused all instances of .NET classes to be considered iterable; this is resolved in the newer version.
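The performance point above can be illustrated in pure Python. This is only a loose analogy (not the engine's actual code): a deep copy is a snapshot that no longer tracks the source, while a view stays live and costs nothing to create.

```python
import copy

source = [[1, 2], [3, 4]]          # stands in for a .NET collection

snapshot = copy.deepcopy(source)   # PythonNet 2.5 style: full, costly copy
view = source                      # PythonNet 3 style: a live "view", no copying

source[0][0] = 99
print(snapshot[0][0])  # 1  -> the copy is unaffected by later changes
print(view[0][0])      # 99 -> the view reflects them
```

This is also why, under PythonNet 2.5, edits made to the converted Python list never reached the original .NET object.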

Migrating from .NET Framework 4.x to .NET Core

PythonNet3 is available from Revit 2025 as a package (called PythonNet3 Engine) and therefore runs under .NET 8+ (Core) instead of .NET Framework 4.x. This change has significant implications for interoperability and certain APIs.

Here are the most common examples I have noticed:

Access to the Global Assembly Cache (GAC)

It is now impossible to directly use the assemblies registered in the Windows GAC (such as Excel Interop or Word Interop).

Solution: Consider migrating to native Python solutions like openpyxl or the OpenXML SDK. If using Excel Interop is absolutely necessary, you must create an instance and use .NET Reflection to interact with it.

Example of using Reflection for Excel Interop (Excerpt):

import clr
import sys
import System
from System import Array
from System.Collections.Generic import List, Dictionary, IDictionary

clr.AddReference("System.Reflection")
from System.Reflection import BindingFlags

from System.Runtime.InteropServices import Marshal

clr.AddReference("System.Core")
clr.ImportExtensions(System.Linq)

xls_filePath = IN[0]
xls_SheetName = IN[1]
dict_values = {}

systemType = System.Type.GetTypeFromProgID("Excel.Application", True)
try:
    ex = System.Activator.CreateInstance(systemType)
except:
    methodCreate = next((m for m in clr.GetClrType(System.Activator)\
            .GetMethods() if "CreateInstance(System.Type)" in m.ToString()), None)
    ex = methodCreate.Invoke(None, (systemType, ))

ex.Visible = False
workbooks = ex.GetType().InvokeMember("Workbooks", BindingFlags.GetProperty, None, ex, None)
workbook = workbooks.GetType().InvokeMember("Open", BindingFlags.InvokeMethod, None, workbooks, (xls_filePath, ))
worksheets = workbook.GetType().InvokeMember("Worksheets", BindingFlags.GetProperty, None, workbook, None)
ws = worksheets.GetType().InvokeMember("Item", BindingFlags.GetProperty, None, worksheets, (xls_SheetName,))

Changing Methods and Namespaces

Some Windows-specific or obsolete technologies have been removed or replaced by new implementations with new namespaces.

Accessing COM objects

The method Marshal.GetActiveObject(), used to get the running COM instance of a specified object, is no longer available.

Solutions:

  1. Use BindToMoniker if you know the path of the file in use.
  2. Write a small C# helper library that re-implements Marshal.GetActiveObject().

Example of using BindToMoniker:

import clr
import os
import time
import System

clr.AddReference("System.Reflection")
from System.Reflection import BindingFlags

clr.AddReference("AcMgd")
clr.AddReference("AcCoreMgd")
clr.AddReference("Autodesk.AutoCAD.Interop")

from System import *

from Autodesk.AutoCAD.Runtime import *
from Autodesk.AutoCAD.ApplicationServices import *
from Autodesk.AutoCAD.Interop import *
from Autodesk.AutoCAD.ApplicationServices import Application as acapp

changeViewCommand = "_VIEW "

adoc = Application.DocumentManager.MdiActiveDocument
currentFileName = adoc.Name
print(currentFileName)
com_doc = System.Runtime.InteropServices.Marshal.BindToMoniker(currentFileName)
args = System.Array[System.Object]([changeViewCommand])
com_doc.GetType().InvokeMember("SendCommand", BindingFlags.InvokeMethod, None, com_doc, args)

OUT = True

Migrating from IronPython2 to PythonNet3

Explicitly calling the base class constructor

Now, in Python classes that inherit from a .NET type (such as a WinForms Form, a WPF window, a DataTable, or an interface), if you override the __init__ method, you must explicitly call the base class constructor using super().__init__(...).

Example (WinForms):

class TestForm(Form):
    def __init__(self):
        super().__init__() # add this line
        self.Font  = System.Drawing.SystemFonts.DefaultFont
        self.InitializeComponent()
   
    def InitializeComponent(self):
        self._buttonCancel = System.Windows.Forms.Button()
        self._buttonOK = System.Windows.Forms.Button()
        self.SuspendLayout()

Specific Syntax of .NET Class Interface Implementation

Implementing .NET class interfaces, which was difficult under CPython3 (PythonNet 2.5), is now much easier with Dynamo_PythonNet3.

A Python class deriving from a .NET class must have the attribute __namespace__.

Example (Implementation of ISelectionFilter):

class Custom_SelectionElem(ISelectionFilter):
    __namespace__ = "SelectionNameSpace_tEfYX0DHE"
    #
    def __init__(self, bic):
        super().__init__() # necessary  if you override the __init__ method
        self.bic = bic
       
    def AllowElement(self, e):
        if e.Category.Id == ElementId(self.bic):
            return True
        else:
            return False
    def AllowReference(self, ref, point):
        return True

#        

class Custom_FamilyOption(IFamilyLoadOptions) :
    __namespace__ = "FamilyOptionNameSpace_tEfYX0DHE"

    def __init__(self):
        super().__init__() # necessary  if you override the __init__ method
       
    def OnFamilyFound(self, familyInUse, _overwriteParameterValues):
        overwriteParameterValues = True
        return (True, overwriteParameterValues)

    def OnSharedFamilyFound(self, sharedFamily, familyInUse, source, _overwriteParameterValues):
        overwriteParameterValues = True     
        return (True, overwriteParameterValues)

The UI with WPF

IronPython allowed direct use of WPF thanks to a dedicated library (wpf).

There is no equivalent library for PythonNet3. However, with some concessions on binding, it is still possible to use WPF (even with the MVVM pattern) via XamlReader.Load(XmlReader.Create(StringReader(xaml))).

Here is an example of assigning a sub-project by Element type using the MVVM pattern:

import clr
import sys
import System
from System.Collections.ObjectModel import ObservableCollection
#import Revit API
clr.AddReference('RevitAPI')
import Autodesk
from Autodesk.Revit.DB import *
import Autodesk.Revit.DB as DB

clr.AddReference('RevitServices')
import RevitServices
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager

#Get Important vars
doc = DocumentManager.Instance.CurrentDBDocument
uidoc = DocumentManager.Instance.CurrentUIApplication.ActiveUIDocument
uiapp = DocumentManager.Instance.CurrentUIApplication
app = uiapp.Application
sdkNumber = int(app.VersionNumber)

clr.AddReference('System.Data')
from System.Data import *

clr.AddReference("System.Xml")
clr.AddReference("PresentationFramework")
clr.AddReference("System.Xml")
clr.AddReference("PresentationCore")
clr.AddReference("System.Windows")
import System.Windows.Controls
from System.Windows.Controls import *
from System.IO import StringReader
from System.Xml import XmlReader
from System.Windows import LogicalTreeHelper
from System.Windows.Markup import XamlReader, XamlWriter
from System.Windows import Window, Application
from System.ComponentModel import INotifyPropertyChanged, PropertyChangedEventArgs

import time
import traceback
import itertools


class ViewModel(INotifyPropertyChanged): # INotifyPropertyChanged
    __namespace__ = "ViewModel_jhggsbUbwQpY" # use a new, unique namespace each time you edit the class
    def __init__(self, elem_type, lst_Workset):
        super().__init__()
        self._elem_type = elem_type
        self._SelectValue = lst_Workset[0] # set default workset
        self._lst_Workset = ObservableCollection[DB.Workset](lst_Workset)
        #
        self._property_changed_handlers = []
        self.PropertyChanged = None
       
    @clr.clrproperty(DB.Element)
    def ElementType(self):
        return self._elem_type
   
    @clr.clrproperty(System.String)
    def Name(self):
        return self._elem_type.get_Name()
       
    @clr.clrproperty(System.String)
    def FamilyName(self):
        return self._elem_type.FamilyName
       
    def get_SelectValue(self):
        return self._SelectValue
    def set_SelectValue(self, value):
        if self._SelectValue != value:
            self._SelectValue = value
            self.OnPropertyChanged("SelectValue")
    # Add SelectValue as a clr property
    SelectValue = clr.clrproperty(DB.Workset, get_SelectValue, set_SelectValue)
      
    @clr.clrproperty(ObservableCollection[DB.Workset])
    def LstWorkset(self):
        return self._lst_Workset
       
    def OnPropertyChanged(self, property_name):
        event_args = PropertyChangedEventArgs(property_name)
        for handler in self._property_changed_handlers:
            handler(self, event_args)

    # Implementation of add/remove_PropertyChanged
    def add_PropertyChanged(self, handler):
        if handler not in self._property_changed_handlers:
            self._property_changed_handlers.append(handler)

    def remove_PropertyChanged(self, handler):
        if handler in self._property_changed_handlers:
            self._property_changed_handlers.remove(handler)
   


class MainWindow(Window):
    string_xaml = '''
<Window
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        Title="Selection"
        Height="700" MinHeight="700"
        Width="700" MinWidth="780"
        x:Name="MainWindow">
        <Window.Resources>
        </Window.Resources>
        <Grid Width="auto" Height="auto">
            <Grid.RowDefinitions>
                <RowDefinition Height="30" />
                <RowDefinition />
                <RowDefinition Height="60" />
            </Grid.RowDefinitions>
            <Label
                x:Name="label1"
                Content="Selection"
                Grid.Column="0" Grid.Row="0"
                HorizontalAlignment="Left" VerticalAlignment="Bottom"
                Margin="8,0,366.6,5"
                Width="415" Height="25" />
            <DataGrid
                x:Name="dataGrid"
                AutoGenerateColumns="False"
                ItemsSource="{Binding}"
                Grid.Column="0" Grid.Row="1"
                HorizontalAlignment="Stretch" VerticalAlignment="Stretch"
                Margin="8,3,8,7"
                SelectionUnit="Cell"
                CanUserAddRows="False">
                <DataGrid.Columns>
                    <DataGridTextColumn Header="Family Name" Binding="{Binding FamilyName}" Width="*" />
                    <DataGridTextColumn Header="Name" Binding="{Binding Name}" Width="*" />
                    <DataGridTextColumn Header="Category" Binding="{Binding ElementType.Category.Name}" Width="*" />
                    <DataGridTemplateColumn Header="Workset">
                        <DataGridTemplateColumn.CellTemplate>
                            <DataTemplate>
                                <ComboBox x:Name="Combobox"
                                    ItemsSource="{Binding LstWorkset}"
                                    DisplayMemberPath="Name"
                                    SelectedItem="{Binding SelectValue, Mode=TwoWay, UpdateSourceTrigger=PropertyChanged}"
                                    Width="200"/>
                            </DataTemplate>
                        </DataGridTemplateColumn.CellTemplate>
                    </DataGridTemplateColumn>
                </DataGrid.Columns>
            </DataGrid>
            <Button
                x:Name="buttonCancel"
                Content="Cancel"
                Grid.Column="0" Grid.Row="2"
                HorizontalAlignment="Left" VerticalAlignment="Bottom"
                Margin="18,13,0,10"
                Height="30" Width="120">
            </Button>
            <Button
                x:Name="buttonOK"
                Content="OK"               
                Grid.Column="0" Grid.Row="2"
                HorizontalAlignment="Right" VerticalAlignment="Bottom"
                Margin="0,12,22,10"
                Height="30" Width="120">
            </Button>
        </Grid>
</Window>'''
 
    def __init__(self, lst_wkset, lst_elems):
        super().__init__()
        self._lst_wkset = lst_wkset
        self._lst_elems = lst_elems
        self._set_elemTypeId = set(x.GetTypeId() for x in lst_elems if isinstance(x, FamilyInstance))
        self._lst_elemType = [doc.GetElement(xId) for xId in self._set_elemTypeId if xId != ElementId.InvalidElementId]
        #
        # sort _lst_elemType by FamilyName
        self._lst_elemType = sorted(self._lst_elemType, key=lambda x: x.FamilyName)
        #
        # Create an ObservableCollection of ViewModel objects
        self.data = ObservableCollection[System.Object]()
        for elem in self._lst_elemType:
            self.data.Add(ViewModel(elem, self._lst_wkset))
        #
        self.pairLst = []
        #
        xr = XmlReader.Create(StringReader(MainWindow.string_xaml))
        self.winLoad = XamlReader.Load(xr)
        self.InitializeComponent()
       
    def InitializeComponent(self):
        try:
            self.Content = self.winLoad.Content
            #
            self.dataGrid = LogicalTreeHelper.FindLogicalNode(self.winLoad, "dataGrid")
            #
            self.buttonCancel = LogicalTreeHelper.FindLogicalNode(self.winLoad, "buttonCancel")
            self.buttonCancel.Click += self.ButtonCancelClick
            #
            self.buttonOK = LogicalTreeHelper.FindLogicalNode(self.winLoad, "buttonOK")
            self.buttonOK.Click += self.ButtonOKClick
            #
            self.winLoad.Loaded += self.OncLoad
            #
            self.dataGrid.DataContext = self.data               # Set DataContext
            self.winLoad.DataContext = self.data

        except Exception as ex:
            print(traceback.format_exc())

    def OncLoad(self, sender, e):
        print("UI loaded")

    def ButtonCancelClick(self, sender, e):
        self.outSelection = []
        self.winLoad.Close()

    def ButtonOKClick(self, sender, e):
        try:
            # get result from input Data (Binding)
            self.pairLst = [[x.ElementType, x.SelectValue] for x  in self.data]
            self.winLoad.Close()
        except Exception as ex:
            print(traceback.format_exc())

lst_Elements = UnwrapElement(IN[0])
lst_Wkset = FilteredWorksetCollector(doc).OfKind(WorksetKind.UserWorkset).ToWorksets()
objWindow = MainWindow(lst_Wkset, lst_Elements)
objWindow.winLoad.ShowDialog()

OUT = objWindow.pairLst 

Respect method signatures

IronPython is more permissive. With PythonNet, you must strictly adhere to the signature of a .NET method, which means that:

  • the method name must be correct;
  • the method must be called on the correct kind of object (instance method vs. static method);
  • the types of the objects passed as arguments must be correct.

If you pass a native Python list to a .NET method that expects a .NET collection, you must explicitly cast/convert the Python list.

Example: Casting a Python list to List<ElementId> :

from System.Collections.Generic import List
# some imports
selelemz = List[ElementId]([ElementId(124291), ElementId(124292), ElementId(124293)])
elements = FilteredElementCollector(doc, selelemz).WhereElementIsNotElementType().ToElements()

Specific syntax for out and ref parameters

With PythonNet3, out and ref parameters still appear as normal arguments in Python, but the method's return value changes: the modified values of these parameters are returned as (part of) a tuple.

Example (Using ComputeClosestPoints with a parameter out):

IronPython (old syntax):

curvA = cableTray1.Location.Curve
curvB = cableTray2.Location.Curve
outrefClosest = clr.Reference[IList[ClosestPointsPairBetweenTwoCurves]](List[ClosestPointsPairBetweenTwoCurves]())
curvA.ComputeClosestPoints(curvB, True, True, True, outrefClosest)
listOfPoint = [[x.XYZPointOnFirstCurve.ToPoint(), x.XYZPointOnSecondCurve.ToPoint()] for x in outrefClosest.Value]

PythonNet3 (new syntax):

curvA = cableTray1.Location.Curve
curvB = cableTray2.Location.Curve
outrefClosest = IList[ClosestPointsPairBetweenTwoCurves](List[ClosestPointsPairBetweenTwoCurves]())
outrefClosestA = curvA.ComputeClosestPoints(curvB, True, True, True, outrefClosest)
listOfPoint = [[x.XYZPointOnFirstCurve.ToPoint(), x.XYZPointOnSecondCurve.ToPoint()] for x in outrefClosestA]

Please note that the behavior also differs from the old PythonNet 2.5 engine when the method returns nothing (void).

Access to CPython libraries coded in C

PythonNet3 provides excellent access to the entire modern PyPI ecosystem, including libraries that use C extensions (such as numpy, pandas, scipy, openpyxl). IronPython has an incompatibility with these libraries.

LINQ

Like IronPython, PythonNet3 supports LINQ extension methods; however, you must explicitly specify the types of the objects passed as arguments. See the examples in the next section.

No direct access to COM objects

Unlike IronPython, PythonNet does not implement the DLR. As a result, direct dynamic access to COM object properties is not possible.

Two workarounds:

  • Use .NET Reflection
  • Cast objects to the correct interface

Example of using .NET Reflection:

import sys
import clr
import System
from System import Environment
from System.Runtime.InteropServices import Marshal
try:
    from System.Reflection import BindingFlags
except:
    clr.AddReference("System.Reflection")
    from System.Reflection import BindingFlags

xls_filePath = IN[0]
wsNames = []

systemType = System.Type.GetTypeFromProgID("Excel.Application", True)
try:
    ex = System.Activator.CreateInstance(systemType)
except:
    methodCreate = next((m for m in clr.GetClrType(System.Activator)\
                    .GetMethods() if "CreateInstance(System.Type)" in m.ToString()), None)
    ex = methodCreate.Invoke(None, (systemType, ))
   
ex.Visible = False
workbooks = ex.GetType()\
            .InvokeMember("Workbooks", BindingFlags.GetProperty ,None, ex, None)
#
workbook = workbooks.GetType()\
            .InvokeMember("Open", BindingFlags.InvokeMethod , None, workbooks, (xls_filePath, ))
#
worksheets = workbook.GetType()\
            .InvokeMember("Worksheets", BindingFlags.GetProperty ,None, workbook, None)
#
enumerator_sheets = worksheets.GetType()\
            .InvokeMember("GetEnumerator", BindingFlags.InvokeMethod , None, worksheets, None)
#
while enumerator_sheets.MoveNext():
    sheet = enumerator_sheets.Current
    sheet_name = sheet.GetType().InvokeMember("Name", BindingFlags.GetProperty,None, sheet, None)
    wsNames.append(sheet_name)

workbooks.GetType().InvokeMember("Close", BindingFlags.InvokeMethod, None, workbooks, None)
ex.GetType().InvokeMember("Quit", BindingFlags.InvokeMethod, None, ex, None)

if workbooks is not None:
    Marshal.ReleaseComObject(workbooks)
if ex is not None:
    Marshal.ReleaseComObject(ex)
#
workbooks = None
ex = None
#
OUT = wsNames

Migrating from CPython3 (PythonNet2.5) to PythonNet3

This migration is simpler but involves fundamental changes in the management of .NET types.

Of course, everything mentioned in the previous section must be taken into consideration.

PythonNet3 aims to bring the “Python in Dynamo” experience closer to that offered by IronPython, while leveraging the CPython ecosystem (NumPy, Pandas, etc.). It is the industry standard for Python/.NET interoperability.

Major Benefits and Improvements

| Features | PythonNet3 (Python 3.7+ / Python.NET v3) | Comments |
| --- | --- | --- |
| CPython Ecosystem | Excellent access to the entire modern PyPI ecosystem (NumPy, Pandas, etc.). | |
| .NET Collections Mechanism | Automatic conversion is removed. The object is a "view" of the .NET collection, without copying data, which improves performance. However, collections implement the standard Python interfaces from collections.abc. | |
| IEnumerable | PyObject now implements IEnumerable in addition to IEnumerable&lt;T&gt;. A bug where all .NET class instances were treated as iterable has been fixed. | e.g. hasattr(pyObject, "__iter__") |
| LINQ | Ability to use LINQ extension methods on IEnumerable&lt;T&gt;. | Dynamo feature |
| Method Overloading | Support has been improved, including for methods with generic type parameters (&lt;T&gt;). | |
| C# Operators | Python's binary and unary arithmetic operators now call the corresponding C# operator methods. | |
| .NET Class Interfaces | Simplified support for implementing .NET class interfaces. | Dynamo feature |
| Out or ref Parameters | You can now override .NET methods in Python that use ref and out parameters. To do this, return the modified values of these parameters as a tuple. | |
| Inheritance and Constructors | If you override the __init__ method of a .NET type in Python, you must now explicitly call the base class constructor using super().__init__(...). | |
| Conversions of Enumerations | Implicit conversion between C# enums and Python integers is disabled. You must now use enum members (e.g. MonEnum.Option). Additionally, the .NET method Enum.Value.ToString() now returns the value name instead of its integer. | Similar to IronPython |
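The enum change can be mimicked with Python's own enum module. This is a pure-Python analogy only (Color and paint below are made-up names, not .NET or Dynamo API): PythonNet3 now insists on the enum member itself rather than its integer value, and ToString()-style conversion yields the name.

```python
from enum import Enum

class Color(Enum):
    Red = 0
    Green = 1

def paint(c):
    # Under PythonNet3, a .NET method expecting an enum no longer accepts
    # a bare integer; the member itself must be passed.
    if not isinstance(c, Color):
        raise TypeError("expected an enum member, not an int")
    return c.name  # like Enum.Value.ToString() returning the name, not the integer

print(paint(Color.Red))  # Red
```

Code that previously passed raw integers to .NET enum parameters therefore needs to be updated to use the named members.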

Details on some points

Implementing .NET Class Interfaces

In CPython3 (PythonNet 2.5), it was impossible to easily implement .NET class interfaces. This is now fixed with Dynamo_PythonNet3.

Notes:

  • A Python class that derives from a .NET class must have the attribute __namespace__
  • The class Custom_FamilyOption includes an example with ref and out parameters returning as a tuple

class Custom_SelectionElem(ISelectionFilter):
    __namespace__ = "SelectionNameSpace_tEfYX0DHE"
    #
    def __init__(self, bic):
        super().__init__() # necessary  if you override the __init__ method
        self.bic = bic
       
    def AllowElement(self, e):
        if e.Category.Id == ElementId(self.bic):
            return True
        else:
            return False
    def AllowReference(self, ref, point):
        return True

#        

class Custom_FamilyOption(IFamilyLoadOptions) :
    __namespace__ = "FamilyOptionNameSpace_tEfYX0DHE"

    def __init__(self):
        super().__init__() # necessary  if you override the __init__ method
       
    def OnFamilyFound(self, familyInUse, _overwriteParameterValues):
        overwriteParameterValues = True
        return (True, overwriteParameterValues)

    def OnSharedFamilyFound(self, sharedFamily, familyInUse, source, _overwriteParameterValues):
        overwriteParameterValues = True     
        return (True, overwriteParameterValues)

No conversion of .NET Collections and Arrays

This is the major change from CPython3 (PythonNet 2.5), which performed automatic implicit conversion.

With PythonNet3:

  1. Automatic conversion is removed.
  2. .NET collections and arrays now implement the standard Python collections interfaces (collections.abc).
  3. The .NET object behaves like a “view”, which is efficient because there is no data copying.
  4. You can use .NET methods like LINQ directly on these objects.
  5. To use methods specific to Python lists (like append()) or bracket indexing, an explicit conversion via list() is required.

Consequences on the indexing of IEnumerable<T>:

Direct indexing [index] is impossible on types IEnumerable returned by certain methods.

Here is an example with PythonNet3 where we cannot use indexing because the method face.ToProtoType() returns an IEnumerable and not a IList<T>.

Old (or incorrect) code:

element = UnwrapElement(IN[0])

ref = HostObjectUtils.GetSideFaces(element, ShellLayerType.Exterior)[0] # GetSideFaces returns an IList, so the indexer works
face = element.GetGeometryObjectFromReference(ref)
ds_surface = face.ToProtoType()[0] # ToProtoType() on a Revit face returns an IEnumerable, so the indexer fails

New code (Using LINQ or converting):

clr.AddReference("System.Core")
clr.ImportExtensions(System.Linq)

element = UnwrapElement(IN[0])

ref = HostObjectUtils.GetSideFaces(element, ShellLayerType.Exterior)[0] # GetSideFaces returns an IList, so the indexer works
face = element.GetGeometryObjectFromReference(ref)
ds_surface = face.ToProtoType().First() # ToProtoType() returns an IEnumerable, so use LINQ
# OR convert to a Python list
ds_surface = list(face.ToProtoType())[0] # converting to a Python list restores the indexer
OUT = ds_surface

LINQ

Advice:

  • When passing a lambda function as a function parameter, it must be explicitly converted, for example to System.Func[&lt;input_type&gt;, &lt;output_type&gt;].
  • Some extension libraries, such as .NET's DataTableExtensions, still don't work.

Example of using LINQ extension methods

import sys
import clr
import System
from System.Collections.Generic import List, IList

clr.AddReference("System.Reflection")
from System.Reflection import Assembly

clr.AddReference('RevitAPI')
import Autodesk
from Autodesk.Revit.DB import *
import Autodesk.Revit.DB as DB


clr.AddReference('RevitServices')
import RevitServices
from RevitServices.Persistence import DocumentManager
doc = DocumentManager.Instance.CurrentDBDocument

clr.AddReference("System.Core")
clr.ImportExtensions(System.Linq)
#

resultB = FilteredElementCollector(doc).OfClass(FamilySymbol).WhereElementIsElementType()\
            .Where(System.Func[DB.Element, System.Boolean](lambda e_type : "E1" in e_type.Name))\
            .ToList()
           
resultC = FilteredElementCollector(doc).OfClass(FamilySymbol).WhereElementIsElementType()\
            .FirstOrDefault(System.Func[DB.Element, System.Boolean](lambda e_type : "E1" in e_type.Name))
           
resultD = FilteredElementCollector(doc).OfClass(FamilySymbol).WhereElementIsElementType()\
            .FirstOrDefault(System.Func[DB.Element, System.Boolean](lambda e_type : "E12" in e_type.Name))
           
resultGroup = FilteredElementCollector(doc).OfClass(FamilySymbol).WhereElementIsElementType()\
            .GroupBy[DB.Element, System.String](System.Func[DB.Element, System.String](lambda e_type :  e_type.Name))\
            .Select[System.Object, System.Object](System.Func[System.Object, System.Object](lambda x : x.ToList()))
           
           
assemblies = System.AppDomain.CurrentDomain.GetAssemblies()
assemblies_name = assemblies\
    .OrderBy[Assembly, System.String](System.Func[Assembly, System.String](lambda x : x.GetName().Name))\
    .Select[Assembly, System.String](System.Func[Assembly, System.Int32, System.String](lambda asm, index : f"{index + 1} : {asm.GetName().Name}"))
           
print(assemblies_name.GetType())
OUT = resultB.ToList(), resultC, resultD, resultGroup.ToList(), assemblies_name

Obstacles encountered with PythonNet3 and workarounds

Properties of type “Indexer”

Properties of type "Indexer" (like access to Space in the Revit API) are no longer handled directly with the bracket syntax. The indexer must be replaced by the corresponding get_ method.

Example:

Old code:

space = elem.Space[phase]

New code:

space = elem.get_Space(phase)

Error in Python set collection

A Python set() cannot contain objects equal to ElementId.InvalidElementId.

Workarounds:

  • Filter invalid IDs when building the set().
  • Use the .NET class HashSet<T>.

Example (Filtering):

setViewsIds = set(w.OwnerViewId for w in allWires if w.OwnerViewId != ElementId.InvalidElementId)

“with” statement on .NET objects

The use of the keyword with on .NET objects (which implement the interface IDisposable) currently throws an error with PythonNet3, unlike IronPython.

Workaround: Build a custom context manager to explicitly handle the method Dispose().

Example (Custom Context Manager CManager):

import sys
import clr
import System
# Add Assemblies for AutoCAD and Civil3D
clr.AddReference('AcMgd')
clr.AddReference('AcCoreMgd')
clr.AddReference('AcDbMgd')
clr.AddReference('AecBaseMgd')
clr.AddReference('AecPropDataMgd')
clr.AddReference('AeccDbMgd')

# Import references from AutoCAD
from Autodesk.AutoCAD.Runtime import *
from Autodesk.AutoCAD.ApplicationServices import *
from Autodesk.AutoCAD.EditorInput import *
from Autodesk.AutoCAD.DatabaseServices import *
from Autodesk.AutoCAD.Geometry import *

# Import references from Civil3D
from Autodesk.Civil.ApplicationServices import *
from Autodesk.Civil.DatabaseServices import *

# The inputs to this node will be stored as a list in the IN variables.
dataEnteringNode = IN

adoc = Application.DocumentManager.MdiActiveDocument
editor = adoc.Editor

class CManager:
    """
    a custom context manager for Disposable Object
    """
    def __init__(self, obj):
        self.obj = obj
       
    def __enter__(self):
        return self.obj
       
    def __exit__(self, exc_type, exc_value, exc_tb):
        self.obj.Dispose()
        if exc_type:
            error = f"{exc_value} at line {exc_tb.tb_lineno}"
            raise ValueError(error)
        return False  # do not suppress exceptions
             
        
dynCADObjects = IN[0]
with adoc.LockDocument():
    with CManager(adoc.Database) as db:
        print(db)
        with CManager(db.TransactionManager.StartTransaction()) as t:
            print(t)
            for dynObj in dynCADObjects:
                ent = t.GetObject(dynObj.AcadObjectId, OpenMode.ForWrite)
                ent.Layer = "Layer1"
                #a = 10 /0
            t.Commit()

print(f"{db.IsDisposed=}")
print(f"{t.IsDisposed=}")

.NET Objects that Inherit from an Interface

In the case of objects that inherit from a .NET Interface, it may be necessary to cast (convert) the object to that interface to use the inherited methods.

Example (Call BeginInit() on a PictureBox):
In the example below, you will find the PythonNet syntax to perform an explicit cast to a .NET interface.

class Form8(Form):
    def __init__(self):
        super().__init__() # necessary  if you override the __init__ method
        self.InitializeComponent()
   
    def InitializeComponent(self):
        self._pictureBox1 = System.Windows.Forms.PictureBox()
        System.ComponentModel.ISupportInitialize(self._pictureBox1).BeginInit()
        self.SuspendLayout()

The line System.ComponentModel.ISupportInitialize(self._pictureBox1).BeginInit() is therefore a method call on an explicit interface:

  • It "casts" (more precisely, accesses) the object self._pictureBox1 via its implementation of the ISupportInitialize interface.
  • It then calls that interface's BeginInit() method.

Attributes lost on .NET Class Instances when assigning to OUT

Instances of Python classes that derive from a .NET class may have their Python attributes stripped by the Dynamo wrapper when assigned to the OUT variable.

Workaround:

Wrap the .NET object in a simple Python class before assigning it to OUT.

Example:

# Load the Python Standard and DesignScript Libraries
import sys
import clr

clr.AddReference('System.Drawing')
clr.AddReference('System.Windows.Forms')
import System.Drawing
import System.Windows.Forms

from System.Drawing import *
from System.Windows.Forms import *

class WrappNetObj:
    def __init__(self, obj):
        self.InnerObject = obj
    def __repr__(self):
        return "WrappNetObj_" + self.InnerObject.GetType().ToString()

class Form8(Form):
    def __init__(self):
        super().__init__()
        self.out_value = 7
        self.InitializeComponent()
   
    def InitializeComponent(self):
        self.SuspendLayout()
        #
        #Form8
        #
        self.ClientSize = System.Drawing.Size(284, 261)
        self.Name = "Form8"
        self.Text = "Form8"
        self.ResumeLayout(False)

win_obj = Form8()
OUT = WrappNetObj(win_obj)


Missing error line

In rare cases, the error line number in the Python node error message may be missing.

Here are some solutions while waiting for a fix:

  • Implement a try-except block with the traceback library and print the error.
  • Implement a try-except block with a logger and write the error.
  • Implement a debugger, for example using sys.settrace().
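
The first of these workarounds can be sketched as follows: wrapping the failing code in a try-except and printing the formatted traceback recovers the file and line information that the node error message omits.

```python
import traceback

try:
    value = 10 / 0  # stand-in for the code that actually fails
except Exception:
    # format_exc() includes the file/line details that may be
    # missing from the Python node's own error message.
    print(traceback.format_exc())
```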

Conclusion

PythonNet3 establishes itself not only as the current standard but as the only sustainable path forward for development in Dynamo. I hope this migration guide has served as a helpful resource as you prepare to enter this new era of Python in Dynamo.


The post Dynamo PythonNet3 Upgrade: A Practical Guide to Migrating Your Dynamo Graphs appeared first on Dynamo BIM.

]]>
https://dynamobim.org/dynamo-pythonnet3-upgrade-a-practical-guide-to-migrating-your-dynamo-graphs/feed/ 0
Dynamo for Civil 3D 2026.2 Release https://dynamobim.org/dynamo-for-civil-3d-2026-2-release/ Tue, 02 Dec 2025 17:46:56 +0000 https://dynamobim.wpenginepowered.com/?p=56445 Hey Civil 3D users, It’s hard to believe that 2025 is almost over! But before we launch off into next year, we wanted to close out 2025 by giving you a new release of Civil 3D to play with during the holidays. The Civil 3D 2026.2 update is now live, and it comes with a …

The post Dynamo for Civil 3D 2026.2 Release appeared first on Dynamo BIM.

]]>
Hey Civil 3D users,

It’s hard to believe that 2025 is almost over! But before we launch off into next year, we wanted to close out 2025 by giving you a new release of Civil 3D to play with during the holidays. The Civil 3D 2026.2 update is now live, and it comes with a whole bunch of great new features to explore, including improvements to the Drainage Tools for Civil 3D, feature line editing, horizontal regression analysis, Project Explorer, alignments, and more. You can read all about it in the Civil 3D 2026.2 release notes.

And of course, there are lots of new features in Dynamo as well! Most notably, there are 75+ new nodes for drainage objects that enable workflow automation for ponds, channels, and underground storage. On top of that, we’ve also upgraded Dynamo to version 3.6, as well as made several smaller improvements and fixes.

To get all of this great new stuff right now, download and install the Civil 3D 2026.2 update using the Autodesk Access desktop app, or download it by signing in to your Autodesk account on manage.autodesk.com.

TL;DR

Dynamo for Civil 3D 2026.2 includes several groups of new nodes for stormwater control objects, which can be used to boost efficiency for drainage design and analysis workflows. The 75+ new nodes can be found under the “Drainage” section of the Civil 3D node library. These nodes enable you to automate workflows involving pond design, configuration of channel hydraulic parameters, storage curve analysis, and more.

Dynamo Core has been updated to version 3.6, bringing you a revamped and more discoverable Node Autocomplete experience, enhancements to groups, and bug fixes. In addition, it adds significant performance updates that are all about turbocharging your Dynamo navigation experience, making it faster and more responsive, thus minimizing your wait time.

So, what’s dropping in Dynamo for Civil 3D 2026.2?

Automation for drainage design

Have you heard about the new Drainage Tools for Civil 3D? This toolset is designed to integrate the drainage design environment of Civil 3D with analysis capabilities powered by InfoDrainage analysis cloud services, all without leaving Civil 3D. This enhances drainage design capabilities by providing new workflows for the design of ponds, underground storage facilities, and open channels, as well as improved catchment delineation with expanded runoff method support. Previously, the toolset has been available as a plugin that you could download and install separately. But now, the Drainage Tools for Civil 3D are included and installed with Civil 3D 2026.2 by default!

Not only do you get these great tools without a separate installation, but we’ve also added 75+ new nodes for drainage objects that enable workflow automation for ponds, channels, and underground storage. You can find these nodes under the “Drainage” shelf in the Civil 3D node library.

These nodes open up huge possibilities for automating drainage design workflows. For example:

  • Automate storage facility design using the Pond.ByContours, Pond.ByPolyCurve, and UndergroundStorage.ByPolyCurve nodes
  • Create open channels in bulk using the Channel.ByAlignmentProfile and Channel.ByFeatureLine nodes
  • Configure hydraulic parameters like Manning’s coefficient, infiltration rates, channel entry and exit losses, and more
  • Query storage curve data for stormwater control facilities

To give you a quick taste of what you can do with these nodes, here’s an example of using the Curve Mapper node to experiment with different pond shapes that fit within a boundary. Super cool!

Dynamo Core 3.6

We’re continually making improvements to Civil 3D, and part of that work includes keeping Dynamo Core updated so you always have access to a fresh Dynamo release within Civil 3D. Civil 3D 2026.2 includes Dynamo Core 3.6, which introduces a completely revamped Node Autocomplete experience. By simply hovering over any port on a node, you’ll see a purple icon that triggers a new Node Autocomplete toolbar when clicked. Node Autocomplete is an efficiency-boosting feature that, similar to how text processing tools suggest the next word, offers several node suggestions to connect to your selected port. These machine learning-powered node suggestions have the potential to vastly speed up your graph-building workflow, minimizing the need to spend time searching for the right node via trial and error.

In addition, Dynamo Core 3.6 includes enhancements to groups and significant performance updates that are all about turbocharging your Dynamo navigation experience, making it faster and more responsive, thus minimizing your wait time. We won’t repeat everything here, but please do check out the Dynamo 3.6 release post.

Smaller buckets of work

There are also a few smaller items related to catchments that you should know about:

  • There’s a new node called Catchment.SetRunoffMethod that you can use to modify the runoff method of a catchment
  • We also added a few more dropdown nodes to make it easier to select things:
    • Choose Runoff Method
    • Choose Catchment Group Name
    • Choose Catchment Name

Bug fixes

As always, we’re continuously working to improve the quality of Dynamo for Civil 3D, and we were able to fix a few bugs in this release. Thanks again to those of you that brought these issues to our attention. It is greatly appreciated! If you notice something that seems off while using Dynamo, please don’t hesitate to let us know.

Normally we just list out the bugs here, but this one deserves a little more attention. For those of you that have used Python nodes to interact directly with the AutoCAD and Civil 3D APIs, you’ve probably encountered cases where Dynamo would crash after making a simple spelling error or some other minor mistake. This was a longstanding bug that made for a not-so-great experience when experimenting with Python. The good news is that the issue has been fixed in this release! Apologies for the inconvenience, and we hope that your experience is much better moving forward.

Now for the rest of the bugs that we fixed:

  • Fixed an issue in which the PipeNetwork.ShortestPathBetweenParts node would return Part objects instead of Pipe and Structure objects.
  • Fixed a stability issue when using Node Autocomplete.
  • Fixed a stability issue that could occur with Python scripts with invalid API calls.
  • Fixed an issue in which nodes for retrieving the reference surface of an object would return the wrong surface type.

What’s next?

There are some very cool things in the works both for Dynamo and for Civil 3D. Be sure to stay tuned to the public roadmaps for both Civil Infrastructure and Dynamo Core, where you can take a look at current and upcoming work, express your support for features you want to see, and let us know what you think by adding a comment.

Happy Holidays!

The Autodesk Civil Infrastructure Team

The post Dynamo for Civil 3D 2026.2 Release appeared first on Dynamo BIM.

]]>
AI-Powered Automation: Leveraging Dynamo’s Autodesk Assistant (Alpha) to Place As-Built Weir Stones into a Revit Model https://dynamobim.org/ai-powered-automation-leveraging-dynamos-autodesk-assistant-alpha-to-place-as-built-weir-stones-into-a-revit-model/ https://dynamobim.org/ai-powered-automation-leveraging-dynamos-autodesk-assistant-alpha-to-place-as-built-weir-stones-into-a-revit-model/#respond Mon, 01 Dec 2025 14:32:53 +0000 https://dynamobim.wpenginepowered.com/?p=56419 Dynamo is a valuable tool for automating processes in Civil 3D and Revit—especially processes that are repeatable or require revision in the future. While it offers significant time-savings, designers can find it difficult to build Dynamo graphs and create these complex automations. What if there was a way to automate Dynamo itself? Now with Autodesk Assistant …

The post AI-Powered Automation: Leveraging Dynamo’s Autodesk Assistant (Alpha) to Place As-Built Weir Stones into a Revit Model appeared first on Dynamo BIM.

]]>
Dynamo is a valuable tool for automating processes in Civil 3D and Revit—especially processes that are repeatable or require revision in the future. While it offers significant time savings, designers can find it difficult to build Dynamo graphs and create these complex automations.

What if there was a way to automate Dynamo itself? Now with Autodesk Assistant in Dynamo, currently available as part of an Alpha, this is a reality.

In this blog post, I’m going to share my process of using Autodesk Assistant to fully build out complex Dynamo graphs to automate my modeling process for placing as-built weir stones into a Revit model for the construction of a fish passage. I’ll explain the process, including some pitfalls and how I overcame them, and share the prompts I used to direct Autodesk Assistant through graph creation. I’ll also share some tips for making the most out of Autodesk Assistant in Dynamo.

Project Background and Design Context

The project involves the addition of a fish passage to an existing navigable lock and dam. The lock and dam maintain different water pool elevations to create larger slack-water pools for marine navigation and freight transport. However, these large structures become a barrier that prevents native fish from navigating from pool to pool. We are tasked with constructing a fish passage to address this problem. This massive fish passage has many pools and ridges as the elevation gradually increases, constructed from various aggregate layers. On top of each ridge are large weir stones that help dissipate energy from fast-moving water, creating calmer pockets that allow fish to rest as they navigate upstream.

Partial fish passage completion

Before Autodesk Assistant: Early Dynamo Workflow

Initially, Autodesk Assistant was not yet available in Dynamo. My objective was to create a Revit model for the field crews to use, based on the IFC plans. The design files showed there would be approximately 881 weir stones placed along the fish passage. I created a Dynamo workflow to place a generic weir stone at a Coordinate Geometry (COGO) point derived from Civil 3D. Additional COGO points were utilized to maintain the proper alignments between weir stones on a given ridge within the Revit model. This worked great: With one click of a button, it placed all the weir stones exactly as the plans had stated.

Initial Revit Model

Adapting for As-Built Conditions

As construction began, the actual weir stones from the quarry varied greatly in size and were mostly larger than the design had called out. To accommodate this change, I needed to rebuild the Revit model based on as-built data from our survey team. As construction proceeded, the survey team provided coordinates for each stone’s center of mass and corresponding weight. This data formed the foundation of creating a digital twin of the fish passage.

Utilizing Autodesk Assistant Alpha in Dynamo for Civil 3D

Here’s where Autodesk Assistant became essential. I uploaded the COGO data into Civil 3D and manually created some alignments along each wave in the passage (point to point at each COGO point/weir stone along a wave).

From there, I used Autodesk Assistant in Dynamo to collect the data within Civil 3D and export it as JSON. The workflow was quite straightforward, using simple prompts such as:

  • “Extract the COGO data from points group ‘weir stone actual placement’ and create a list to use for placement in Revit”.
  • “Extract the ‘descriptions’ field from the COGO points data, create a list for tonnage size using that description as the size. Use that list to define which model element corresponds with the size of each stone.”
  • “Extract the alignments in site ‘weir stone alignments’ and prepare them as polycurves and set them up the export as json files”.

These prompts generally worked well in Dynamo for Civil 3D using Autodesk Assistant Alpha and created a Dynamo graph ready to export three different JSON files for the COGO points, tonnage sizes, and alignment PolyCurves.

However, there were a few hiccups. One was getting all the data into one JSON file using only Autodesk Assistant Alpha. To work around that, I split the different data types into separate JSON files, knowing I’d reference them individually in Dynamo for Revit. I also had to rewrite prompts that weren’t clear enough initially or were left too open-ended. Honing in on correct prompt submissions meant it took mere hours to have data ready to import into Revit.
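
Referencing those separate files later in Dynamo for Revit can be sketched with a small helper in a Python node. The file names and folder below are hypothetical, not from the project:

```python
import json

def load_exports(paths):
    """Load each exported JSON file into one dictionary keyed by data type."""
    data = {}
    for key, path in paths.items():
        with open(path) as f:
            data[key] = json.load(f)
    return data

# Hypothetical export locations; substitute the folder your
# Dynamo for Civil 3D graph actually wrote to.
# OUT = load_exports({
#     "points": r"C:\export\cogo_points.json",
#     "tonnage": r"C:\export\tonnage_sizes.json",
#     "alignments": r"C:\export\alignment_polycurves.json",
# })
```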

Parts of the finalized Dynamo for Civil 3D graph using only Autodesk Assistant Alpha

Implementing the Process in Revit

I have my federated model of the fish passage in Revit, with topography and toposolids prepared from Civil 3D and Inventor. These elements gave 3D representation to the layering of rock required to build up the civil surfaces of the pools and ridges. I also have the structural elements we are constructing alongside the civil site work, such as the cofferdam, future piers, and bridge, all geolocated in Revit.

In Dynamo for Revit, I started off by prompting Autodesk Assistant with the entire workflow. My initial prompt instructed Autodesk Assistant to:

“Import the three json files for COGO points, alignments, and one for tonnage sizes. Place a weir stone model element at each COGO point, but the model element is dependent upon the tonnage size. Round the tonnage size to the nearest whole number between 2 and 8 tons. Set it up for me to assign the model elements in Revit. Once we have the elements placed, we will extract the polycurve alignments and rotate the weir stones tangentially to the nearest point on the polycurve alignment.”

What Didn’t Work

As is expected in the Alpha stage, Autodesk Assistant in Dynamo is working through some growing pains as the team continues to improve it, and I did encounter some challenges while experimenting with it. While the overall prompt strategy was theoretically sound, it ended up being too much for Autodesk Assistant to process in a single interaction. It kept trying to build every subroutine simultaneously, which often led to circular dependencies or partially completed processes.

Other times I’d be making good progress redirecting the workflow only for my tokens to expire, forcing me to end the conversation and restart. Because Autodesk Assistant currently doesn’t retain memory between sessions, each restart meant losing all prior context, which quickly became frustrating. At that point, it was clear I needed to change my approach.

What Worked

I pivoted to a slower, more incremental workflow, basically treating the whole build-out like a checklist. I began by breaking the task into very small steps: Extract data from the JSONs, verify that data, define what format it needs to be in, and then guide the Assistant using specific nodes or suggestions instead of handing it everything at once. In a few cases, I even split workflows into totally separate graphs just to keep things simple and manageable.

Once I started working this way, Autodesk Assistant Alpha behaved much more predictably, and the graph build-out was far easier to control and understand.

Key Strategies That Helped

  • Use checklists. Because the Assistant is stateless, any crash or token timeout wipes the conversation. A checklist inside the graph gave it a clear reference point every time I restarted the chat. The checklist was one of the few manual operations I used, repeatedly copying and pasting into the prompt window for Autodesk Assistant Alpha to review.
  • Break prompts into smaller sections. The current context limitations mean Autodesk Assistant performs better when handling one focused task instead of a large, multi-layered request.
  • Give clear do’s and don’ts. For example, instruct it to use native Dynamo nodes rather than generating unnecessary Python, and limit Python edits only to specific parts of the script to prevent runaway graph-building.

Do’s and Don’ts When using Autodesk Assistant Alpha

  • Do: Break the workflow into small, focused tasks. Don’t: Give the Assistant a giant, multi-step prompt all at once.
  • Do: Save and reuse checklists to re-anchor after crashes. Don’t: Expect the Assistant to remember previous sessions.
  • Do: Specify when to use native nodes. Don’t: Let it auto-generate Python when not needed.
  • Do: Clearly limit the scope of script edits. Don’t: Allow it to expand or rebuild entire graphs unintentionally.

Manually generated checklist prompt used every time I started a new Autodesk Assistant window

Final weir stone placement graph, mostly generated by Autodesk Assistant

As-built weir stone placement model. Color-coded weir stones placed and rotated

Achieving Time Savings and Workflow Efficiency with Autodesk Assistant

Using Autodesk Assistant to build this Dynamo graph and place these as-built weir stones saved days’ worth of work compared to manually locating and orienting them in Revit. As more stones are physically placed, I’ll continue to receive more data. By simply updating my COGO point file and rerunning the graph, I’ll get an updated real-time model in Build for the crew to reference. The real-world need for these modeled, geolocated stones will be met in the form of as-built drawings showing exact locations and spacing, as well as models the customer can potentially use for water flow analysis.

Without Autodesk Assistant, developing a graph like this would have taken weeks. I would have been back to square one, manually placing model elements within Revit. By utilizing ideas generated by Autodesk Assistant and restructuring my prompts, I had geolocated weir stones in Revit within a day.

Being able to leverage Autodesk Assistant in Dynamo represents a huge step forward in the evolution of design automation. By unlocking natural language processing alongside data-driven scripting tools, Autodesk has shortened the journey from conceptual direction to executable workflow. As this case study demonstrates, teaming up with AI allows us to perform complex, site-specific modeling faster, more accurately, and with far fewer manual adjustments.

Note from the Dynamo team: Would you like to try out Autodesk Assistant in Dynamo? Sign up for the Alpha today! More details in this Dynamo forum post.

The post AI-Powered Automation: Leveraging Dynamo’s Autodesk Assistant (Alpha) to Place As-Built Weir Stones into a Revit Model appeared first on Dynamo BIM.

]]>
https://dynamobim.org/ai-powered-automation-leveraging-dynamos-autodesk-assistant-alpha-to-place-as-built-weir-stones-into-a-revit-model/feed/ 0
Dynamo Core 4.0 Release https://dynamobim.org/dynamo-core-4-0-release/ https://dynamobim.org/dynamo-core-4-0-release/#respond Tue, 25 Nov 2025 16:20:56 +0000 https://dynamobim.wpenginepowered.com/?p=56362 Welcome, friends of Dynamo, to another exciting release: version 4.0! This release packs a punch with multiple performance improvements, a selection of quality-of-life improvements, and several bug fixes. Important changes also include migration to .NET 10 and a default Python engine change, so don’t miss the details below. This release is lighter on new features, …

The post Dynamo Core 4.0 Release appeared first on Dynamo BIM.

]]>
Welcome, friends of Dynamo, to another exciting release: version 4.0! This release packs a punch with multiple performance improvements, a selection of quality-of-life improvements, and several bug fixes. Important changes also include migration to .NET 10 and a default Python engine change, so don’t miss the details below.

This release is lighter on new features, but for a few good reasons. First, we put significant effort into behind-the-scenes updates, including the .NET 10 migration and the PythonNet3 update, to ensure Dynamo’s stability and future readiness. In addition, the team has been hard at work building an agentic AI future for Dynamo. These features are now available as part of an alpha, which you can sign up for today!

TL;DR

Dynamo Core 4.0 introduces enhanced performance for several nodes, improvements to groups, a better uploading experience for packages with lots of files, paneling nodes out of experimental mode, sample graphs in Dynamo Sandbox, and more. This release also makes PythonNet3 the default Python engine and includes API changes and a default path change for Dynamo resources.

What is Dynamo and its flavors?

What is Dynamo Core?

Dynamo Core is a collection of bundled components that consist of the graphical interface, the compute engine, the scripting language DesignScript, and the out-of-the-box nodes that are not specific to another program like Revit or Civil 3D.

What is Dynamo for <INSERT HOST HERE>?

Dynamo for [Revit, Civil 3D, FormIt, Advance Steel, Alias or Robot Structural Analysis] is a collection of host-specific nodes that work with Dynamo Core and run inside of said host.

What is Dynamo Sandbox?

Dynamo Sandbox is for package developers and other folks working with Dynamo code who want to stay up to date with the latest and greatest stuff coming out. Sandbox is Dynamo’s “Core” functionality distributed in a way that doesn’t interfere with other Dynamo installations and doesn’t require any other applications (except for a few windows components and some optional extras). You can read more about this distinction here.

So, what’s dropping with Dynamo 4.0?

Getting even faster: Performance updates

Our previous release delivered significant performance gains, and in that blog post we hinted at continued improvements. Several of those have arrived today in the form of geometry handling, optimized Boolean nodes, and improved PolySurface.ByJoinedSurfaces and Point.Project nodes. Let’s dive into the details.

PolyCurve offset operations

We noted that the function PolyCurve::offset() was doing a complex check just to see if a shape was a single straight line. We’ve simplified this by counting components and performing geometry operations only when necessary.

Result with multiple PolyCurve nodes: 1.5x faster.

Boolean nodes

Boolean nodes that take in multiple input geometries have been optimized. We made the process of handling multiple objects more efficient by no longer creating unnecessary duplicates during Boolean operations.

Result for Solid.ByUnion: about 10x faster.

Performance improvement for Solid.ByUnion

Improved geometry intersect and geometry distance functions

We’ve also improved performance of two of the top 100 most used nodes in Dynamo: Geometry.Intersect and Geometry.IntersectAll. Improvements for these nodes were tested on a graph with 1,600 faces (each a circular disk).

  • Result for Geometry.Intersect and Geometry.IntersectAll: 5-6x faster.
  • Result for Geometry.DoesIntersect: about 30x faster.

Performance gains of Geometry.Intersect and Geometry.IntersectAll nodes

Surface join and projection operations

We’ve enhanced the way surfaces are joined together, significantly speeding up the process. What previously took 262 seconds to complete now completes in 2 seconds. This improvement is achieved by using a more efficient algorithm.

  • Result for PolySurface.ByJoinedSurfaces: 131x faster.
  • Point.Project has also been improved. Result: 400x faster.

Performance gains of PolySurface.ByJoinedSurfaces and Point.Project nodes

We hope you enjoy these efficiency gains. Tip: You can use the TuneUp extension (seen in the screenshots throughout this section) to monitor node execution times in your graphs.

PythonNet3 is now the default Python engine

With Dynamo 4.0, we’re thrilled to share a milestone that has been years in the making: PythonNet 3 is now the default Python engine for all new Python nodes. This change is a significant moment in the ongoing evolution of Python scripting in Dynamo. 

Our team (and many of you in the community) have invested a huge amount of time closing the gaps between IronPython and CPython. We’ve worked hard to ensure that PythonNet 3 provides a modern, reliable, and sustainable path forward in Dynamo for all of you Python lovers. See this blog post for detailed information on how to migrate your graphs.

If you’re curious about the story behind this work, we shared a deep dive in our earlier blog post on PythonNet 3. 

Why CPython is not moving forward as a Dynamo engine

As we worked to bring CPython into Dynamo, the team spent a lot of time closing the long-standing gap between the IronPython world and the broader .NET ecosystem that PythonNet connects to. This meant working with community members to dig into real workflows, smoothing out compatibility pain points, and making sure PythonNet 3 could stand confidently where both IronPython and CPython had previously served different needs.

Along the way, one thing became clear: PythonNet 3 now covers nearly all the scenarios where CPython had been helpful, without introducing another engine for you to manage. After reviewing migration data, community graphs, and real project behaviors, we found that only a very small number of cases behave differently when moving from CPython to PythonNet 3. At the same time, supporting CPython as another engine would make the ecosystem more fragmented and significantly increase testing and maintenance complexity for us and everyone. It would ultimately slow us down without meaningfully improving your scripting experience.

So, while CPython played an important part in this transition period, we’re choosing to focus on a single, modern Python engine that keeps Dynamo simpler, more maintainable, and easier for all of you to rely on. 

And because we know change can still be disruptive, we’ve put a lot of care into making the migration experience feel gentle, predictable, and well-supported. 

What you’ll see when using Python in Dynamo now

New graphs 

All new Python nodes created in Dynamo 4.0+ start with PythonNet3. Don’t worry about backward compatibility: For those who work in multi-version shops (e.g., Revit or Civil 3D 2025/2026), install the PythonNet3 Engine package in Dynamo 3.3–3.6 to maintain compatibility.  

Working with existing graphs using CPython 3 

Because CPython is no longer supported, Dynamo will automatically migrate CPython nodes to PythonNet 3. 

Here’s what happens: 

  1. A backup copy of your original file is created automatically.
  2. All CPython nodes (including custom nodes that use CPython) are converted to PythonNet3.
  3. A toast notification lets you know how many nodes were migrated.
  4. When saving, you’ll see a reminder that your Python nodes will now use PythonNet3.

Again, don’t worry about backward compatibility: For those who work in multi-version shops (e.g., Revit or Civil 3D 2025/2026), install the PythonNet3 Engine package in Dynamo 3.3–3.6 to maintain compatibility.

Notifications let you know that the graph has been migrated to PythonNet3.

Working with existing graphs using IronPython 2.7 or IronPython 3 

If your graph uses an IronPython engine, there’s no auto-migration:

  • If the matching IronPython package is installed, your graph runs normally. 
  • If it’s missing, you’ll see a dependency warning in the Workspace References extension asking you to download the package. 

You can continue using IronPython by reinstalling the package. But because IronPython hasn’t been updated in years and the Dynamo team hasn’t actively supported these engines for quite some time, we strongly recommend migrating to PythonNet3 to ensure your graphs keep working reliably going forward. While DynamoIronPython2.7 and DynamoIronPython3 will remain available as packages on the Dynamo Package Manager, they will no longer be maintained by the Dynamo team.

In this case, the migration option available to you is node-by-node migration using the Migration Assistant within the Python Editor.

We’ll also publish a detailed migration guide soon with tips, gotchas, and recommendations to help you convert CPython and IronPython code with confidence. 

This animation illustrates how the Python Script node notifies you of the update. You can use the Migration Assistant tool in the editor to update your code.

Notification Preferences 

If you prefer fewer reminders, you can turn off Python-related toast notifications in Preferences > Features > Python. 

A note for package authors

If your package includes Python nodes, we’d love for you to update them to PythonNet3 so the community can keep relying on your work. We’ve tried to make this as easy as possible with the Migration Assistant (and automatic migrations for CPython). Thank you for everything you share with Dynamo users! 

Better collapsed groups to keep your graphs clean

Ever since Dynamo 3.5 and 3.6, we’ve been steadily improving groups—freezing them, searching inside them, adding smarter context menus, and listening closely to your feedback along the way. Many of you (especially our Grasshopper-loving friends!) told us how much you rely on groups and clusters to simplify complex scripts and how you’d love something similar in Dynamo. While we’re not fully at parity yet, these improvements are another meaningful step in that direction.

In the past, collapsing a group hid all the nodes inside but maintained the size of the group. That meant you still had to zoom and pan around your large graphs to navigate them, which wasn’t ideal. 

Caution: wide load! Full-size collapsed group in action

Now, collapsed groups can shrink to a minimal, node-like footprint, so you can simplify your graph visually and move around much more easily.

Before: Wide collapsed group. Now: Compact collapsed group.

If you build large or intricate graphs and just want to keep things simple and organized, collapsed groups now help you simplify your workspace without losing clarity or control. You can collapse them down and choose to show only the ports that matter.

Previously, when code blocks were placed within a collapsed group, it was challenging to determine the representation of the output port; tooltips provided the only means of identification, which was not always sufficient. Some of you even discovered our little hidden Easter egg in the previous release already. With this release, we’ve continued and expanded that improvement to collapsed groups, making it official: now code block output ports inside collapsed groups show the variable name or type, so you can instantly understand what you’re connecting together without having to hover or guess.

  • New settings under Preferences > Visual Settings > Groups let you collapse groups into a minimal, node-like footprint—perfect for decluttering dense graphs.
  • Hide optional or unconnected ports in collapsed groups to keep things tidy. 
  • And if you later connect to a hidden port, the group updates automatically to show it. 
On the left side [ 1 ], the collapsed group settings are toggled off. The collapsed group is the same width as its expanded form, and optional ports are shown by default. On the right side [ 2 ], the settings are toggled on, which gives the collapsed group a compact size and hidden optional ports.
  • When you maximize a collapsed group, other nodes and groups move aside to avoid overlapping, thanks to auto-layout.

Other group updates

  • Node Autocomplete now correctly adds the node inside the triggering node’s group.
  • We also fixed an issue where custom group styles were duplicated when closing and reopening Dynamo for Revit and Civil 3D.

New PolyCurve behavior

In Dynamo 4.0, we have enabled new PolyCurve behavior to make it more predictable. PolyCurve direction is now consistently determined by the direction of the first curve from the input curves array that belongs to it. This may lead to breaking behavior in old graphs in favor of consistency and predictability. If you wish to preserve old directionality, you can set “DefaultEnableLegacyPolyCurveBehavior” to “true” in your DynamoPreferences.xml.
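For reference, here is a sketch of what that override might look like. The setting name comes from this post, but the exact element form and its placement inside DynamoPreferences.xml are assumptions, so check your installed file before editing:

```xml
<!-- Hypothetical entry in DynamoPreferences.xml: restores the pre-4.0
     PolyCurve directionality. Element form and location are assumed. -->
<DefaultEnableLegacyPolyCurveBehavior>true</DefaultEnableLegacyPolyCurveBehavior>
```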

API changes

APIs and nodes that were marked as obsolete in 1.x have been removed in Dynamo 4.0. You can reference the full list of changes here.

Going forward, we will continue pruning obsolete APIs and nodes. For example, in release 5.0, we plan to remove the APIs and nodes that were marked as obsolete in 2.x. This waiting period gives users time to update their graphs and extensions. Nothing is being marked as obsolete in 4.0, but we will notify you of any potential changes in upcoming point releases.

Dynamo migrated to .NET 10

When you launch Dynamo 4.0, you’ll be asked to update to .NET 10 if you haven’t already. The .NET 10 migration effort ensures Dynamo remains aligned with Microsoft’s technology roadmap, well ahead of the end of support for .NET 8 in November 2026. The user-facing impact of this migration is minimal; primarily, icons for custom nodes will break, and package authors will need to update them.

For assistance, see Updating Your Packages and Libraries for Dynamo 4.0 and the “Adding Custom Icons to Zero Touch Nodes” section of Advanced Dynamo Node Customization.

Default path update

Dynamo has moved to using AppData and Program Files instead of loading resources from Program Data as the default location for packages, samples, templates, and other items. Program Data is shared between multiple users on Windows, so if a user were to install something malicious there, it would be available for all users on that machine. The new default path helps avoid this issue.

Quality-of-life improvements

  • A new splash screen image lets you know you’ve entered a new world of Dynamo goodness.
  • Package Manager is now better able to handle package uploads with a large number of files with the introduction of asynchronous processing and upload progress information.
  • Paneling nodes were introduced in release 3.1 in experimental mode and allow you to create panels on surfaces. Now, paneling nodes are out of experimental mode and part of the standard node library, so you no longer need to enable them in settings to be able to use them.
  • Sample graphs are now available for Dynamo Sandbox, and you can find them in the top menu under Help > Samples, or on the home screen in the left Samples tab. Previously, these were only available when running Dynamo under a host application.
  • The Graph Properties panel has moved from the Extensions menu to the File menu.
  • The purple syntax highlighting in Python nodes has been replaced with red so that purple can be reserved for AI features.

Bug fixes

  • We ensured that sign-in functionality and data collection agreements no longer appear while in no-network mode.
  • We fixed an issue where the pan tool would deactivate after panning once. Now, the pan tool will remain active until the user deactivates it.
  • We fixed a rendering issue with the Color Range node.

I like what I see! How can I get my hands on Dynamo 4.0?

Dynamo 4.0 will be made available in our host integrations at a future date and can be explored right now through the dynamobuilds.com website or the GitHub build page, available in the Sandbox version of Dynamo. Then, drop us a line on the forum!

Want more detail? Check out the release notes

For more information on other minor features, bug fixes, and known issues in Dynamo 4.0, take a look at the release notes!

With each release, Dynamo grows more and more powerful, and we’re so grateful to have you along for the ride. Every improvement we make is for our community members, and we rely on your ideas, feedback, and inspiration to continue growing! Curious to see what else we’re working on? Visit Dynamo Roadmap, where you can take a look at current and upcoming work, express your support for features you want to see, and let us know what you think by adding a comment.

The Dynamo Team

The post Dynamo Core 4.0 Release appeared first on Dynamo BIM.

]]>
https://dynamobim.org/dynamo-core-4-0-release/feed/ 0
Turning Energy Models into Design Narratives with Agentic Nodes in Dynamo https://dynamobim.org/turning-energy-models-into-design-narratives-with-agentic-nodes-in-dynamo/ https://dynamobim.org/turning-energy-models-into-design-narratives-with-agentic-nodes-in-dynamo/#respond Mon, 03 Nov 2025 14:39:36 +0000 https://dynamobim.wpenginepowered.com/?p=56337 Artificial intelligence is reshaping how design teams in the Architecture, Engineering, and Construction (AEC) industry approach early-phase project documentation. At the Dynamo Hackathon in Nashville, we joined peers from across the industry to experiment with AI capabilities embedded directly inside Dynamo. The challenge? Explore use cases where structured building data could be paired with AI …

The post Turning Energy Models into Design Narratives with Agentic Nodes in Dynamo appeared first on Dynamo BIM.

]]>
Artificial intelligence is reshaping how design teams in the Architecture, Engineering, and Construction (AEC) industry approach early-phase project documentation. At the Dynamo Hackathon in Nashville, we joined peers from across the industry to experiment with AI capabilities embedded directly inside Dynamo. The challenge? Explore use cases where structured building data could be paired with AI to solve real-world engineering problems. 

Our team pitched and then delivered a workflow that transforms early-phase Revit energy models into schematic design (SD) narratives with just the push of a button. 

A Unique Scenario: From Massing Model to Narrative 

We focused on a common but time-intensive requirement: producing design narratives. Every engineering trade is asked to provide these narratives early in design. Traditionally, these documents take hours to write, require careful cross-checking, and often vary in quality or style between teams. 

Instead of starting from scratch, we leveraged Revit’s analytical energy model. Even a simple massing form creates a data-rich model containing spatial information, loads, and system assumptions. Using Dynamo, we gathered this data and paired it with prompt instructions and example narratives. AI then converted the structured data into a draft SD narrative, one that could be exported directly into Word for sharing with project teams and owners. 

For reference, here is a snippet of the data in Revit that went into the design narrative:

Why It Matters: Push-Button Narratives 

Most engineers aren’t interested in prompting AI directly. Running the workflow through Dynamo Player meant no typing instructions, no trial and error. They just click one button to generate a narrative. 

The output? A professional SD narrative aligned with the energy model data. Owners gain transparency into early design assumptions, while engineers save time, utilize their Revit data, and improve consistency across projects. 

For a firm like IMEG, which executes roughly 2,000 projects per year, the return on investment is significant. Writing narratives takes about 4 hours per trade. At 3 trades per project and a $50/hour billable rate, this equates to over $1 million in potential savings annually, all while standardizing deliverables across teams. 
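The back-of-the-envelope math above can be checked in a couple of lines. All figures come from the post itself; this is a rough model of potential savings, not an exact accounting:

```python
# Savings estimate: narratives take ~4 hours per trade, 3 trades per
# project, ~2,000 projects per year, at a $50/hour billable rate.
projects_per_year = 2000
trades_per_project = 3
hours_per_narrative = 4
billable_rate = 50  # dollars per hour

annual_savings = (projects_per_year * trades_per_project
                  * hours_per_narrative * billable_rate)
print(f"${annual_savings:,} per year")  # -> $1,200,000 per year
```

That lands comfortably above the “over $1 million” figure quoted in the post.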

Methodology: Building the Graph and Engineering the Prompt

To make the AI agent truly useful, we had to teach it how to “think like an engineer.” The Dynamo graph we built used the new Dynamo Agentic nodes to interact with the Revit model and extract relevant analytical data. The core prompt we used was: 

“You are a Dynamo, Revit, and BIM engineer and expert. Please use the active Revit model to find the following information. Collect the elements in the model of the Analytical Spaces category. Then use Standard nodes to collect the following parameters and provide them in lists as well as sum them up. Properties: Area, Peak Heating Load, Peak Cooling Load, Peak Latent Cooling Load, Outdoor Airflow. Create a list by Reference Level of the building.” 

This prompt evolved through several iterations. We discovered that using user-readable names, the ones shown in Revit’s Properties Palette, was more effective than relying on internal API terminology. In some cases, we had to hint at the Revit API directly to access certain parameters, especially when the AI struggled to interpret ambiguous or technical terms.

We also experimented with different ways of grouping and summarizing the data. For example, organizing by “Reference Level” helped contextualize the narrative and made the output more intuitive for design teams. 

Here’s a snippet of the graph showing the data extraction and prompt construction: 

This iterative process helped us refine the agent’s understanding and improve the quality of the generated narratives. Each tweak to the prompt or graph structure brought us closer to a reliable, repeatable workflow. 

Once we had a graph doing data extraction, we were able to pair it with the LLM (Large Language Model) nodes to call OpenAI and Claude to ask the AI to take a sample narrative and prompt it to write a narrative using the information we feed in via the graph output.  

We found some issues and variations in prompting the AI to output something usable. We settled on taking the Revit data as a dictionary string in a JSON format and connecting that with the following prompt sample: 

“Here is an Example of a Schematic Narrative for HVAC:

The proposed HVAC system is designed to support a building with a total conditioned floor area of approximately ft². Based on load calculations, the facility requires a peak heating capacity of 850 MBH, a peak cooling load of 120 tons, and a peak sensible cooling load of 95 tons. Based on the site, the suggestion is to use a packaged rooftop unit with VAVs with electrical reheat.

Here is the data of my project: We gave the "stringified" data that we had previously extracted.

Please Write a narrative like the example based on the given using the data provided.”
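The "stringified data" step above can be sketched in a few lines of Python. The dictionary keys and values here are illustrative stand-ins for the lists the graph extracts from the Analytical Spaces category, not the exact parameter names used in the model:

```python
import json

# Hypothetical stand-ins for the data extracted by the Dynamo graph.
model_data = {
    "Total Area (ft2)": 121860,
    "Peak Heating Load (MBH)": 1359,
    "Peak Total Cooling Load (tons)": 167,
    "Peak Latent Cooling Load (tons)": 26,
    "Outdoor Airflow (CFM)": 35756,
}

# "Stringify" the dictionary as JSON so it can be embedded in the prompt.
data_string = json.dumps(model_data, indent=2)

# Assemble the final prompt: example narrative + project data + instruction.
prompt = (
    "Here is an Example of a Schematic Narrative for HVAC:\n"
    "<example narrative goes here>\n\n"
    "Here is the data of my project:\n" + data_string + "\n\n"
    "Please write a narrative like the example using the data provided."
)
```

The resulting `prompt` string is what gets wired into the LLM node as a single input.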

It quickly gave us an output file on the local PC with the following text:

“The proposed HVAC system is designed to serve a facility with a total conditioned floor area of approximately 121,860 ft², distributed across three floors (Building 1 Level 1, Level 2, and Level 3). Based on schematic load calculations, the building requires a peak heating capacity of approximately 1,359 MBH and a peak total cooling load of around 167 tons, with a peak latent cooling load of 26 tons. The ventilation design accommodates an outside air requirement of about 35,756 CFM to meet both indoor air quality and code requirements. The schematic recommendation is to utilize packaged rooftop units equipped with variable air volume (VAV) terminal units featuring electrical reheat. This approach allows for efficient load management across diverse zones and optimal energy utilization while maintaining occupant comfort throughout the building.”

We did find that providing context and boundaries around roles and rules helped. For example, we used:

“Only respond with “Failed” or the resultant narrative no other text. You are a Professional Mechanical Engineer you job is to help review data and write a schematic narrative describing the overall scope of the project. Knowing its schematic, you need to hit the key points of design for the system. use the following sample as guidance for writing an overall report.”

We noticed that otherwise the LLM only being given one interaction can sometimes provide a lot of extra info we would have to edit out of the report.  

Lastly, we wrote a simple Python script that saves the text to a file with a .doc extension so that Word would auto-open and interpret the results as a document.

import os

# Dynamo Python node inputs: IN[0] = narrative text, IN[1] = file name, IN[2] = output folder
body_text = IN[0]
file_name = IN[1]
folder_path = IN[2]

# Clean up the file name (remove any quotes or extra spaces)
file_name = file_name.strip().strip('"').strip("'")

# Ensure a .doc extension so Word opens the file as a document
if not file_name.lower().endswith(".doc"):
    file_name += ".doc"

out_path = os.path.join(folder_path, file_name)

# Make sure the target folder exists, then write the text
os.makedirs(folder_path, exist_ok=True)
with open(out_path, "w", encoding="utf-8") as f:
    f.write(body_text)

OUT = out_path

Then we set up Dynamo Player inputs so the graph could also be run with Dynamo Player. We thought this would be an easier way to deploy to other staff in the organization who may not be Dynamo experts!

For firms like Osborn Engineering, with a high volume of sports facility designs, iterating on the shapes and sizes of building scope in schematic design is challenging and can keep the team from providing rapid design feedback at early stages. A tool like this, which leverages AI to help solve design problems and limit the time spent writing reports, will help on larger-scale projects.

Results: A New Way to Deliver Value 

By the end of the hackathon, we achieved a working proof of concept: 

  • Extracted energy model data directly from Revit 
  • Pushed it through Dynamo’s AI nodes 
  • Generated an SD narrative in minutes 
  • Exported the output to a Word document 

The result was exactly what we were aiming for. We got a clear, client-facing narrative aligned with Revit model data, executed with a click from Dynamo Player. 

Looking Ahead: Version 2 and Beyond 

We left Nashville excited about what comes next. Future iterations will: 

  • Refine prompts for even higher-quality outputs 
  • Automate Word exports with company branding and standardized formatting 
  • Extend the workflow to other repetitive documentation tasks like submittals and reports 
  • Integrate with Dynamo as a Service (DaaS) in Autodesk Forma

This hackathon proved that AI inside Dynamo isn’t just an experiment, it’s actually a glimpse at how documentation-heavy workflows in the AEC industry can be automated, standardized, and improved at scale. 

Takeaways 

The project highlighted more than just technical feasibility. It showed how AI and structured design data can be combined to create immediate business value. We also saw firsthand how Autodesk is democratizing access to AI tools, making advanced workflows accessible to both experts and beginners. 

The future isn’t just about new features in Dynamo or Revit, it’s about engineers working smarter, not harder, and using these tools in creative new ways. As interoperability and automation continue to expand, the barriers between design data and client deliverables will keep shrinking. The possibilities are endless, and this is only the beginning.  

Meet the Team

The post Turning Energy Models into Design Narratives with Agentic Nodes in Dynamo appeared first on Dynamo BIM.

]]>
https://dynamobim.org/turning-energy-models-into-design-narratives-with-agentic-nodes-in-dynamo/feed/ 0
Seamless Dynamo-Forma Connection: Select Proposal Elements Node https://dynamobim.org/seamless-dynamo-forma-connection-select-proposal-elements-node/ https://dynamobim.org/seamless-dynamo-forma-connection-select-proposal-elements-node/#respond Mon, 13 Oct 2025 15:52:44 +0000 https://dynamobim.wpenginepowered.com/?p=56286 Hello Dynamo enthusiasts, We are excited to introduce a new addition to the Dynamo Forma package: the Select Proposal Elements node. This node is designed to remove one of the biggest blockers in your workflow; It allows you to bring Forma elements into Dynamo without the need to open Forma at all, making it easier …

The post Seamless Dynamo-Forma Connection: Select Proposal Elements Node appeared first on Dynamo BIM.

]]>
Hello Dynamo enthusiasts,

We are excited to introduce a new addition to the Dynamo Forma package: the Select Proposal Elements node. This node is designed to remove one of the biggest blockers in your workflow: it allows you to bring Forma elements into Dynamo without the need to open Forma at all, making it easier than ever to focus on your scripts and algorithms. It also eliminates the network connection issues that occasionally surface while using Forma Dynamo Player.

Why we made it

Previously, running Dynamo scripts in Forma was a repetitive process. You had to:

  1. Prepare your script in Dynamo without Forma knowledge.
  2. Run your graph in Dynamo Player within Forma to fetch data.
  3. Switch back to Dynamo to work on your graph now that it has Forma data.
  4. Return to Dynamo Player within Forma to run your graph and test the results.

This back-and-forth introduced a new paradigm to working within Dynamo from Forma: the need to run from inside Forma to send data to Dynamo or to send data from Dynamo back.

 

The current workflow for setting up Dynamo-Forma connection

 

What the node does

The Select Proposal Elements node allows you to:

  • Bring Forma elements to Dynamo without running your script in Forma.

 

  • Eliminate the need for unnecessary filter nodes by letting you pick the elements you want to work with directly.

The Select Proposal Elements node replaces multiple nodes making your graph cleaner

 

  • Visualize Forma elements in Dynamo within their project context.

Review your Forma proposal in Dynamo

 

  • Work seamlessly across multiple hubs, projects, sites, and proposals.

Target multiple Forma proposals within a single Dynamo workspace

 

  • Keep your element selections when sharing your script.

Replace the Select Proposal Elements node to get the latest version of your Forma proposal

 

  • Prepare for future integrations with Data Exchange and AEC Data Model environments.

Key workflows

Here are some examples of how you can use the Select Proposal Elements node:

  1. Work with what you need, not everything
    You can import only the elements you need from Forma and work on them in Dynamo using a simplified node layout to create your designs. This helps you stay focused on the design logic instead of setup complexity.

By selecting only what you need, you can create outcome-based workflows

 

  2. Reviewing updates in Forma projects
    Use the node to check design updates in your Forma proposal without leaving Dynamo. This allows you to keep the design context visible while refining your workflow.

Replace the Select Proposal Elements node to get the latest version of your Forma proposal

 

Known limitations

We are starting with Forma as the first supported environment, but this is just the beginning.

Here are the current limitations:

  • Missing water texture.
  • In the node’s 3D viewer environment, you cannot select an entire building at once if it contains floors; selection is currently limited to floor-level elements.
    To select all floor elements, you can either:

    • Use the Hierarchy View and select the parent element of the floors, or

    • Use the Shift selection method in the By Category View — hold Shift and click the first and last elements to select the entire range.

Use Ctrl + Click to add to your selection in the 3D Viewer

 

Future Roadmap

The Select Proposal Elements node significantly reduces the contextual knowledge required to work with Forma nodes. It eliminates the need to juggle between Forma and Dynamo environments and minimizes the number of nodes you need in Dynamo. This is part of a larger vision where the same workflow will be extended to other cloud environments such as Data Exchange and AEC Data Model. By simplifying the process today, we are building towards a future where Dynamo is one of the most powerful platforms for managing data-driven workflows across multiple environments. In the future, we are considering improving this node by:

  • Connecting it to multiple platforms (Data Exchange, AEC Data Model)
  • Allowing rule-based filters

Conclusion

The Select Proposal Elements node is the first step toward a more unified and accessible data workflow in Dynamo. You can use it together with Select Site and Send To Forma Proposal nodes to get and send data to Forma without opening Forma. It also helps you:

  • Stay in one environment,
  • Reduce the learning curve, and
  • Get prepared for future integrations

We would love to hear your feedback. Try out the new node, share your experience, and help us shape the next iteration. Your input will be key as we continue to evolve this workflow for the Dynamo community.

The Dynamo Team

The post Seamless Dynamo-Forma Connection: Select Proposal Elements Node appeared first on Dynamo BIM.

]]>
https://dynamobim.org/seamless-dynamo-forma-connection-select-proposal-elements-node/feed/ 0
Dynamo for Civil 3D 2026.1 Release https://dynamobim.org/dynamo-for-civil-3d-2026-1/ Tue, 05 Aug 2025 18:08:23 +0000 https://dynamobim.wpenginepowered.com/?p=56250 Hey Civil 3D users, Hope you are enjoying the summer! It’s crazy to think that we’re already in August, with Autodesk University right around the corner. How the time flies… Speaking of which, it’s already time for another Civil 3D release! The Civil 3D 2026.1 update is now live, and it is packed with goodies. …

The post Dynamo for Civil 3D 2026.1 Release appeared first on Dynamo BIM.

]]>
Hey Civil 3D users,

Hope you are enjoying the summer! It’s crazy to think that we’re already in August, with Autodesk University right around the corner. How the time flies…

Speaking of which, it’s already time for another Civil 3D release! The Civil 3D 2026.1 update is now live, and it is packed with goodies. There’s a new horizontal regression analysis tool for creating alignments from survey data, enhancements to the 3D Model Viewer, and a ton of new features in Autodesk Drainage Analysis for Civil 3D. You can read all about it in the Civil 3D 2026.1 release notes.

What about Dynamo? We’ve got you covered there, too! The big improvement in this release is the inclusion of in-depth documentation and sample files for all nodes in the Civil 3D section of the library. On top of that, we’ve also upgraded Dynamo to version 3.5, as well as made several smaller improvements and fixes.

To get all of this great new stuff right now, download and install the Civil 3D 2026.1 update using the Autodesk Access desktop app, or download it by signing in to your Autodesk account on manage.autodesk.com.

TL;DR

The Civil 3D 2026.1 update makes Dynamo easier to learn by adding in-depth documentation and sample files for nodes in the Civil 3D section of the library. To view this information, right-click on a node and then click Help, or select the node and press F1.

Dynamo Core has been updated to version 3.5, bringing you a new package publishing experience; the new Curve Mapper node, with nine different curve types; new List.ReplaceItemAtIndices and List.GroupBySimilarity nodes for list manipulation; a trim boundary condition for paneling nodes, and more.

So, what’s dropping in Dynamo for Civil 3D 2026.1?

In-depth documentation for Civil 3D nodes

Dynamo’s Documentation Browser has been around for a while now. It is your go-to spot when you want to learn more about what a particular node does. We’ve been on a journey to add content for all nodes, starting with the AutoCAD section of the library in the Civil 3D 2025.2 update. In this release, we’re completing the journey with a sweeping content update for all ~1000 nodes in the Civil 3D section of the library! This should go a long way toward making Dynamo easier to learn for both beginner and advanced users.

As a refresher, you can view the help content for a node by simply right-clicking on the node and then clicking Help, or selecting the node and pressing F1.

And don’t forget, you can also insert the example graph directly into your workspace!

Dynamo Core 3.5

Besides the continual improvements that we’re making in Civil 3D, Dynamo Core is also advancing at the speed of greased lightning. Civil 3D 2026.1 includes Dynamo Core 3.5, which includes some lovely improvements to the package publishing experience and some highly-requested new nodes for managing lists. There is also a new Curve Mapper node that you can use to create complex geometry and patterns. We won’t repeat everything here, but please do check out the Dynamo 3.5 release post.

Smaller buckets of work

There are also a few smaller items that you should know about:

  • There’s a new Alignment.RenumberTags node that automatically renumbers all tags along an alignment according to their geometric order
  • The performance of the FeatureLine.SetElevationsFromSurface and FeatureLine.SetPointElevation nodes has been significantly improved

Bug fixes

As always, we’re continuously working to improve the quality of Dynamo for Civil 3D, and we were able to fix several bugs in this release. Thanks again to those of you that brought these issues to our attention. It is greatly appreciated! If you notice something that seems off while using Dynamo, please don’t hesitate to let us know. Here’s a snapshot of the most important fixes in this release, and you can view the full list in the release notes.

  • Fixed an issue where the BaselineRegion.SetOffsetTargetFrequency node would not set the correct value for offset target geometry points
  • Fixed an issue where the Parcel.UserDefinedProperties node would produce a warning when the UDPs of the parcel were used in a label expression
  • Fixed an issue where labeling nodes did not support child label styles
  • Fixed an issue where the CogoPoint.SetLabelStyle node would not allow for setting the label style to <default>
  • Fixed an issue where the Block.Extents node would incorrectly return null when the block definition contained certain types of invalid geometry
  • Fixed an issue where the Profile.BySurface and Profile.ByStationsElevations nodes would return null when the profile object layer did not exist in the drawing

What’s next?

There are some very cool things in the works both for Dynamo and for Civil 3D. Be sure to stay tuned to the public roadmaps for both Civil Infrastructure and Dynamo Core, where you can take a look at current and upcoming work, express your support for features you want to see, and let us know what you think by adding a comment.

Stay cool, and see you soon at AU!

The Autodesk Civil Infrastructure Team

The post Dynamo for Civil 3D 2026.1 Release appeared first on Dynamo BIM.

]]>
Dynamo Core 3.6 Release https://dynamobim.org/dynamo-core-3-6-release/ https://dynamobim.org/dynamo-core-3-6-release/#respond Tue, 29 Jul 2025 17:17:10 +0000 https://dynamobim.wpenginepowered.com/?p=56190 Summer is heating things up in the Northern Hemisphere, and the Dynamo team has been cooking up some smoking hot improvements to your experience! This release represents a culmination of long-standing efforts to improve Dynamo’s performance, with significant gains achieved that Dynamo users should notice. In addition to these performance gains, we’ve added exciting new …

The post Dynamo Core 3.6 Release appeared first on Dynamo BIM.

]]>
Summer is heating things up in the Northern Hemisphere, and the Dynamo team has been cooking up some smoking hot improvements to your experience! This release represents a culmination of long-standing efforts to improve Dynamo’s performance, with significant gains achieved that Dynamo users should notice. In addition to these performance gains, we’ve added exciting new features to our toolset. Read on to learn all about them!

TL;DR

This release introduces a revamped and more discoverable Node Autocomplete experience, enhancements to groups, and bug fixes. In addition, it adds significant performance updates that are all about turbocharging your Dynamo navigation experience, making it faster and more responsive, thus minimizing your wait time.

What is Dynamo and its flavors?

What is Dynamo Core?

Dynamo Core is a collection of bundled components that consist of the graphical interface, the compute engine, the scripting language DesignScript, and the out-of-the-box nodes that are not specific to another program like Revit or Civil 3D.

What is Dynamo for <INSERT HOST HERE>?

Dynamo for [Revit, Civil 3D, FormIt, Advance Steel, Alias or Robot Structural Analysis] is a collection of host-specific nodes that work with Dynamo Core and run inside of said host.

What is Dynamo Sandbox?

Dynamo Sandbox is for package developers and other folks working with Dynamo code who want to stay up to date with the latest and greatest stuff coming out. Sandbox is Dynamo’s “Core” functionality distributed in a way that doesn’t interfere with other Dynamo installations and doesn’t require any other applications (except for a few windows components and some optional extras). You can read more about this distinction here.

So, what’s dropping with Dynamo 3.6?

A new way to Node Autocomplete

In this release, we’re making Node Autocomplete much easier to find and amplifying it with a host of exciting improvements!

What’s Node Autocomplete? Node Autocomplete is an efficiency-boosting feature that, similar to how text processing tools suggest the next word, offers several node suggestions to connect to your selected port. These machine learning-powered node suggestions have the potential to vastly speed up your graph-building workflow, minimizing the need to spend time searching for the right node trial-and-error-style.

Sounds pretty great, right? Unfortunately, not all users stumbled upon this feature, as it was a bit hidden behind two clicks of an input or output port. In this release, we’re changing that by introducing a new icon that appears on port hover. Simply click it, and you’ll trigger the new Node Autocomplete toolbar!

 

Let’s take a closer look at this new toolbar.

New node autocomplete toolbar

 

  • [ 1 ] Ghosted node: The suggested node is shown in purple, with a purple connecting wire to indicate that it’s not yet placed. As you cycle through the suggested nodes, the preview updates. In this transient state, the node is not yet executing, so using this feature won’t slow down your graph. Once you accept the suggested node, it exits the purple “ghosted” state and is placed as a regular node.
  • [ 2 ] Documentation link: Click this icon to open the Documentation Browser inside Dynamo and learn about Node Autocomplete in more detail.
  • [ 3 ] Suggested node: Shows the icon and name of the currently shown suggested node.
  • [ 4 ] Search: Narrow down the list of suggested nodes by using the search bar. Helpful if you see a lot of suggestions and/or if you know which node you want and want to skip right to it.
  • [ 5 ] List of suggested nodes: If you’d prefer to see all the suggested nodes at once, click the drop-down [ 3 ] to view them as a list.
  • [ 6 ] Forward and back arrows: You can click these to cycle through the suggested nodes. Alternatively, you can use the arrow keys on your keyboard.
  • [ 7 ] Confirm button: Click this to place the suggested node. It will be auto-connected to the triggering port and placed using the auto-layout feature to keep your graph neat. You can also use the Enter key to do this.
  • [ 8 ] Cancel button: To exit Node Autocomplete, click this button or press the Esc key.

Got double-clicking in your muscle memory? No worries: if you double-click a port, you’ll see a notification reminding you to click the new Node Autocomplete icon instead.

While we think the new toolbar is pretty exciting, you can go back to the old look and feel from the Preferences menu. Under Features > Node Autocomplete, disable the Enable Floating Toolbar toggle to revert to the old version of Node Autocomplete. Note that even with the floating toolbar disabled, you’ll still trigger Node Autocomplete using the icon rather than by double-clicking a port.

Here’s a demo showcasing first the new Node Autocomplete experience, then the old.

 

With this release, we’re entering a whole new world of graph-building efficiency. Give Node Autocomplete a try today!

A faster, more performant Dynamo

In this release, you may notice a smaller set of features. That’s because the Dynamo team has been putting a lot of focus and energy into significantly increasing Dynamo’s performance. We’ve been making steady performance improvements over time, culminating in the big push you’ll see in this version. We’re excited to introduce some of the major performance enhancements that will increase your quality of life.

Dynamo Loading

  • Accelerated recent files loading: Dynamo now loads faster, thanks to optimized handling of your recent files. No more delays as your list of large graphs piles up—it’s been streamlined!
  • Custom node optimization: We heard you love custom packages, but they slowed things down. Not anymore! Custom packages will now load twice as fast.

Graph navigation: Dynamo 3.6 is smoother and snappier!

  • Navigating large graphs with annotations is now 7% faster.
  • You will notice minimal waiting time as you zoom in and out of your large graphs.

Graph loading and closing:

  • For users of Dynamo Core, you will notice ~2x faster file loading in Dynamo 3.6.
  • For users of Revit 2025 & 2026 (Dynamo 3.3 & 3.5), you will notice 2x faster file loading in Dynamo 3.6.
  • For users of Revit 2023 & 2024 (Dynamo 2.16 & 2.19), you will notice 2.5x faster file loading in Dynamo 3.6.
  • We have enhanced the file saving and workspace closure processes. Closing a file is now up to 4x faster: previously, closing large graphs could take up to a minute, but with this update, files close within a few seconds.
  • Code block nodes have been optimized, improving general node load times by an average of 16%.

Heads up: Auto-Backup Interval default is now 5 minutes (previously 1 minute). You can tweak it in Preferences under General > Backup Settings.

Demo of the graph navigation improvement

Over the years, Dynamo has seen some pretty awesome performance improvements, making it faster and more joyful, including better package search and improved, faster node search. With the dramatic performance improvements in Dynamo 3.6, there has never been a better time to upgrade to a new version of Dynamo!


**These results are average improvements based on our testing; the performance gains you see may vary depending on your machine and setup.

And we’re not done yet—we’re continuing work to boost Dynamo’s performance even further, so please stay tuned.

Group enhancements

Groups are a great way to annotate and organize your graphs, and this release introduces several improvements to the group experience. More to come!

  • You can now double-click a group to add a Code Block to it.

 

  • When you right-click a group, you can now use the search bar to add a node directly to that group.

 

 

  • Frozen groups are now marked with a snowflake icon. You can unfreeze the group by clicking the icon.
new interactive icon for frozen groups

 

Bug fixes

This release includes a collection of bug fixes. Here’s a selection of highlights, and you can find the full list in the release notes.

  • Previously, formulas would break after using the OpenXML node to export new data to an existing Excel file containing formulas. Now, formulas are recalculated when the user opens the file in Excel.
  • The Custom Selection and Curve Mapper nodes now retain both the content and the selected item when the node is duplicated.

 

  • We fixed inaccurate tooltip placement for input and output ports and for collapsed groups.
  • We fixed an issue where Python engine references in workspace references were lost on restarting Dynamo.

I’m in! How can I get my hands on Dynamo 3.6?

Dynamo 3.6 will be made available in our host integrations at a future date, but you can explore it right now in the Sandbox version of Dynamo, available through the dynamobuilds.com website or the GitHub build page. Then, drop us a line on the forum!

Want more detail? Check out the release notes

For more information on other minor features, bug fixes, and known issues in Dynamo 3.6, take a look at the release notes!

We hope you are as excited as we are about the powerful improvements to Dynamo, including the performance gains. Thank you for being a part of this community—we notice and appreciate all your feedback, improvement ideas, and engagement! Curious to see what else we’re working on? Visit Dynamo Roadmap, where you can take a look at current and upcoming work, express your support for features you want to see, and let us know what you think by adding a comment.

The Dynamo Team

The post Dynamo Core 3.6 Release appeared first on Dynamo BIM.
