Jeffrey Palermo, CTO & Chairman of Clear Measure, presented the AI Software Factory, an executive-level architectural pattern for orchestrating software delivery from idea to production. He opened by addressing a core problem: software delivery has become the constraint in most organizations, and teams can’t simply work faster when defects and production incidents are constantly consuming capacity. Poorly engineered AI adoption doesn’t solve this; it just ships bugs faster. The AI Software Factory is the next evolution following Agile, DevOps, and cloud adoption, orchestrating people, processes, and automation across the entire delivery lifecycle.
A key theme throughout was that visibility must come before automation. Using a live Kanban board demo and a real client project, Jeffrey showed how a weekly scorecard tracking throughput, mean time to delivery, escape defects, and production incidents reveals bottlenecks and process gaps that would otherwise stay hidden. From there, AI automation is introduced intentionally, starting with simple, low-risk tasks, and always measured against the scorecard to confirm real improvement. Clear Measure’s goal is to help organizations build software delivery systems that safely exploit AI without destabilizing their business.
Windows DevDrive is a specialized volume format that provides dramatic performance improvements for development workloads:
If you’re managing dozens of source repositories, moving them to a DevDrive can shave significant time off your daily development cycle.
Before starting, ensure you have:
- GitHub Copilot CLI installed (winget install GitHub.Copilot.CLI, or download from github.com/copilot)
- Windows 11 or Windows Server 2022

# Verify you're on Windows 11 or Server 2022
[System.Environment]::OSVersion.VersionString
# You should see something like:
# Microsoft Windows NT 10.0.26100.0
# Requires Administrator privileges
# Run as Administrator!
# Create a 200 GB DevDrive on D: with label "DevDrive-Repos"
New-DevDrive -Drive D: -Size 200GB -Name "DevDrive-Repos"
# Verify creation
Get-Volume | Where-Object { $_.FileSystem -eq "ReFS" }
# You should see your new DevDrive listed
⏱️ Time: 2-5 minutes for DevDrive initialization
GitHub Copilot CLI automates the entire migration process. Launch it:
copilot
Copy and paste this prompt into Copilot CLI (adapt the paths and repository count to your setup):
I have lots of repositories in C:\Users\YourUsername\source\repos and I want this path to have a virtual link to my new DevDrive at D:\. I want all the subfolders moved to D:\ and the paths to have virtual links so that old references to C:\Users\YourUsername\source\repos get redirected to D:\ instead, so that tools like Git, VSCode, and Visual Studio continue working without any configuration changes.
Replace:
- YourUsername with your Windows username
- D:\ with your DevDrive path (if using a different letter)

Press Enter and let Copilot CLI handle the rest.
The CLI will analyze your request and create a structured plan:
🔍 Analyzing your setup...
📋 Creating implementation plan...
PLAN CREATED:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Problem: Migrate repositories (~85 GB) from C:\Users\YourUsername\source\repos
to DevDrive (D:\) while maintaining backward compatibility
Approach:
1. Create a symbolic link at C:\Users\YourUsername\source\repos → D:\
2. Move all repositories from C:\Users\YourUsername\source\repos to D:\
3. Verify all paths resolve correctly
4. Test Git operations through the symbolic link
Strategy:
✓ Single directory link (more maintainable than individual links)
✓ Target layout: All repos directly under D:\ (D:\repo-name structure)
✓ Backward compatibility: Old paths continue working transparently
Ready to proceed? (Type 'start' to begin migration)
Type start to begin:
start
Copilot CLI will execute the migration in phases:
=== PHASE 1: PRE-MIGRATION VALIDATION ===
✓ Source directory exists
✓ DevDrive (D:\) exists
✓ Source size: 85.27 GB
✓ D:\ free space: 252.65 GB
✓ Sufficient space available
✓ Path is a regular folder (ready for conversion)
✓ No open handles detected
=== PHASE 1 COMPLETE ===
What it checks: that the source directory exists and is a regular folder, that its total size fits in the DevDrive's free space, and that no open file handles would block the move.
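These pre-flight checks are plain file-system operations. As a rough, cross-platform illustration (not Copilot CLI's actual code; the `preflight` helper and its paths are hypothetical), the same validation can be sketched in Python:

```python
import shutil
from pathlib import Path

def preflight(source: str, target_drive: str) -> dict:
    """Mirror the Phase 1 idea: verify the source folder, measure its
    size, and confirm the target has room before anything moves."""
    src = Path(source)
    if not src.is_dir():
        raise FileNotFoundError(f"Source directory missing: {src}")
    if src.is_symlink():
        raise RuntimeError("Source is already a link; migration may be done.")
    # Total size of every file under the source tree.
    size = sum(f.stat().st_size for f in src.rglob("*") if f.is_file())
    free = shutil.disk_usage(target_drive).free
    if free <= size:
        raise RuntimeError(f"Not enough space: need {size}, have {free}")
    return {"source_bytes": size, "free_bytes": free}
```

Failing fast here is what makes the later, destructive phases safe to automate.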
=== PHASE 2: MIGRATING REPOSITORIES ===
Found N repositories to migrate
Using Robocopy for efficient migration with resume capability...
Moving: project-api
✓ Success
Moving: framework-core
✓ Success
Moving: webapp-frontend
✓ Success
...
Moving: infrastructure-tools
✓ Success
=== PHASE 2 SUMMARY ===
Successfully migrated: N
Failed migrations: 0
=== PHASE 2 COMPLETE ===
📊 Migration Details: Copilot CLI drives Robocopy under the hood, so an interrupted migration can be resumed without re-copying repositories that already moved.
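The resume capability comes from skipping work that is already done: a repository that already exists at the destination is not moved again. A simplified Python analog (the `migrate_repos` helper is illustrative, not what Copilot CLI actually runs):

```python
import shutil
from pathlib import Path

def migrate_repos(source_root: str, target_root: str) -> list[str]:
    """Move each top-level repo folder to the target, skipping any that
    are already there -- which is what makes reruns safe to resume."""
    moved = []
    for repo in sorted(Path(source_root).iterdir()):
        if not repo.is_dir():
            continue
        dest = Path(target_root) / repo.name
        if dest.exists():  # already migrated on a previous run
            continue
        shutil.move(str(repo), str(dest))
        moved.append(repo.name)
    return moved
```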
=== PHASE 3: CREATING SYMBOLIC LINK ===
Creating symbolic link...
Source: C:\Users\YourUsername\source\repos
Target: D:\
✓ Symbolic link created successfully
✓ Symbolic link verified!
=== PHASE 3 COMPLETE ===
What happens: Copilot CLI runs mklink /J to create a directory junction, so any tool that opens the old path is transparently redirected to D:\.
=== PHASE 4: VERIFICATION & TESTING ===
1. Testing path redirection...
Old path: C:\Users\YourUsername\source\repos
Points to: D:\
✓ Verified
2. Listing repositories via old path...
Found N repositories
✓ Path is accessible
3. Listing repositories via new path (D:\)...
Found N repositories
4. Verifying consistency...
✓ Repository counts match
5. Testing file access through old path...
Sample repo: sample-project
✓ Can access files
=== PHASE 4 COMPLETE ===
=== PHASE 5: TESTING TOOL COMPATIBILITY ===
Testing Git Integration...
Found Git repo: sample-project
Testing 'git status' via old path...
✓ Git commands work through symbolic link
Testing Absolute Path Access...
C:\Users\YourUsername\source\repos\sample-project
✓ Accessible
D:\sample-project
✓ Accessible
=== PHASE 5 COMPLETE ===
That’s it! Copilot CLI handles the entire migration automatically.
To verify everything works, test with your tools:
cd C:\Users\YourUsername\source\repos\sample-project
git status
code C:\Users\YourUsername\source\repos\sample-project
cd C:\Users\YourUsername\source\repos\my-dotnet-app
dotnet build
✅ All should work without any configuration changes.
Windows directory junctions (the flavor of symbolic link created with mklink /J) provide a transparent redirect:
Application Request
↓
C:\Users\YourUsername\source\repos
↓
[Windows Kernel: This is a junction to D:\]
↓
D:\
↓
Actual Files & Directories
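Junctions are an NTFS feature, but the redirect behavior is the same idea as a POSIX symbolic link, which makes it easy to demonstrate anywhere. A small Python sketch, with temporary directories standing in for C:\Users\YourUsername\source\repos and D:\:

```python
import tempfile
from pathlib import Path

# Stand-in for D:\ -- where the files actually live.
target = Path(tempfile.mkdtemp())
(target / "sample-project").mkdir()
(target / "sample-project" / "README.md").write_text("hello")

# Stand-in for the legacy C:\...\source\repos path.
old_path = Path(tempfile.mkdtemp()) / "repos"
old_path.symlink_to(target, target_is_directory=True)

# Anything reading through the old path is transparently redirected.
via_old = (old_path / "sample-project" / "README.md").read_text()
print(via_old)  # → hello
```

This is why Git, VSCode, and Visual Studio keep working with zero configuration changes: the redirect happens below them, at the file-system layer.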
Once migrated, you’ll see performance gains in development workflows:
| Operation | Before (C: SSD) | After (DevDrive) | Improvement |
|---|---|---|---|
| Git clone (2 GB repo) | 45 seconds | 28 seconds | ~38% faster |
| Git status (1000 files) | 2.3 seconds | 1.8 seconds | ~22% faster |
| dotnet build | 35 seconds | 26 seconds | ~26% faster |
| npm install | 18 seconds | 13 seconds | ~28% faster |
| File enumeration (500K) | 4.2 seconds | 2.8 seconds | ~33% faster |
Results vary based on disk hardware and repository size
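The Improvement column is simply the relative time saved, (before − after) / before, rounded to a whole percent. You can reproduce it from the table's numbers:

```python
# (before_seconds, after_seconds) pairs taken from the table above.
timings = {
    "git clone (2 GB repo)": (45, 28),
    "git status (1000 files)": (2.3, 1.8),
    "dotnet build": (35, 26),
    "npm install": (18, 13),
    "file enumeration (500K)": (4.2, 2.8),
}

improvements = {
    op: round((before - after) / before * 100)
    for op, (before, after) in timings.items()
}
print(improvements)
```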
Problem: access denied or permission errors during link creation.
Solution: Run PowerShell as Administrator (right-click → “Run as administrator”).
Problem: the migration stops with an out-of-space error.
Solution: Check the DevDrive's size and free space before retrying:
$drive = Get-Volume -DriveLetter D
$freeGB = [math]::Round($drive.SizeRemaining / 1GB, 2)
Write-Host "Free space on D:\: $freeGB GB"
Problem: files are in use and cannot be moved.
Solution: Close all applications accessing the repositories (IDEs, VSCode, file explorers, Git tools, antivirus) and retry.
If you need to reverse the migration:
# 1. Remove the symbolic link (this removes only the link, not your data)
rmdir "C:\Users\YourUsername\source\repos"
# 2. Recreate the folder and move the repositories back from D:\
#    (this sketch assumes D:\ contains only your repositories)
mkdir "C:\Users\YourUsername\source\repos"
robocopy D:\ "C:\Users\YourUsername\source\repos" /E /MOVE
Migrating your development repositories to a Windows DevDrive is a low-risk, high-reward operation, and GitHub Copilot CLI powered by Claude Haiku 4.5 automates it end to end.
In just 15-20 minutes (including DevDrive setup and verification), you can achieve sustained performance improvements across your entire development workflow—all powered by Copilot CLI and Claude Haiku 4.5.
🚀 Happy coding on your faster DevDrive!
The promise of AI in software development is that it will profoundly increase the rate of software delivery. But merely using AI tools does not deliver on that promise. Putting together an end-to-end automated process is what’s required. That is the pattern of the “AI Software Factory”.
In this webinar, you will see an AI Software Factory in motion and learn what you need to do to implement this pattern for yourself to 2x and 3x your pace of software delivery.
To enable AI tools to process information stored in existing software systems or databases, that data must reach the language model’s context window. There are only two ways to achieve this: (1) include it directly in the prompt, or (2) provide it as the result of a call to an LLM tool/function.
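Neither route is vendor-specific; the message shapes below are illustrative only (the `lookup_order` tool and the dictionary format are hypothetical, not any particular LLM API). The point is that either way, the data ends up as text inside the context window:

```python
import json

def lookup_order(order_id: str) -> dict:
    """Hypothetical tool backed by an existing database or system."""
    return {"order_id": order_id, "status": "shipped"}

# Route 1: fetch the data yourself and inline it in the prompt.
record = lookup_order("A-1001")
prompt = f"Summarize this order for the customer:\n{json.dumps(record)}"

# Route 2: expose the lookup as a tool. The model requests it by name,
# and the result is appended to the conversation as a tool message.
messages = [{"role": "user", "content": "What happened to order A-1001?"}]
tool_call = {"name": "lookup_order", "arguments": {"order_id": "A-1001"}}
result = lookup_order(**tool_call["arguments"])
messages.append({"role": "tool", "name": tool_call["name"],
                 "content": json.dumps(result)})
```

MCP standardizes route 2: it gives models a uniform way to discover tools like this one and call them against your existing systems.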
The Model Context Protocol (MCP) offers a standardized pattern for discovering, grouping, and enabling sets of AI tools that language models can access. However, most traditional web services are not well-suited for agentic workflows. To support true agentic patterns with your existing systems, you need an MCP server.
MCP is emerging as the new standard API for large language models.
This training will jumpstart your journey toward designing and implementing an MCP server for your custom system or database.
.NET AI Architecture for DevOps provides strategies and design patterns to enhance software development and deployment with AI integration.
AI‑driven development is transforming how .NET engineers deliver software. Instead of envisioning a fully autonomous future, this webinar presents a practical, near‑term model: enabling a skilled engineer to compress a month of feature work into a single day through AI‑Driven Development.
Jeffrey Palermo introduces an approach that pairs clear architectural decisions with high‑leverage automation and modern AI coding tools like Cursor, GitHub Copilot, and Claude. Attendees will learn how to design an AI‑ready DevOps environment, use parallelized local and cloud runners, and guide LLMs to generate code, tests, and supporting artifacts with confidence and consistency.
Implementing .NET AI Architecture for DevOps ensures consistent testing, scalable pipelines, and effective AI integration.
The session covers how to structure a repeatable workflow for rapidly delivering production‑quality features—achieving 10× throughput without sacrificing quality or control.
Many pundits compare AI to a junior developer on your team. This is false. AI is code that runs on a computer and cannot operate fully by itself. It must be operated like any other sophisticated machine. However, it is a very sophisticated machine capable of building software features if the development environment is well-designed and complete.
This webinar will demonstrate the ability to fully delegate a software feature to an unmonitored AI tool. When an engineer reviews the output, it will be a fully developed feature that meets the team’s standards, including all automated tests, and a complete pull request ready for review. Move into the future with us, where you can delegate the development of easy features and changes entirely to the computer, allowing your engineers to focus on new, novel, or difficult features.
In this episode, Jeffrey Palermo joined Dan Clarke to discuss AI-driven development and DevOps, exploring how AI is changing software and the importance of implementing DevOps processes before AI can begin to leverage writing features in a reliable and consistent way.
Tune in to learn more!
Software engineering is at an inflection point. Some pundits claim that all programming will be done by AI in the future, while others claim that it’s all hype. We will find the reality somewhere in the middle. This webinar dives into what is possible now with Visual Studio 2026 and .NET 10 regarding AI-Driven Development.
Why do so many software projects go off track?
This video reveals the three key elements that set successful projects apart: the solution, the project, and the team—all before execution begins.
Join us for a deep dive into the future of intelligent software development, where performance meets productivity. This webinar explores how Visual Studio 2026 and .NET 10 are reshaping the developer experience with AI-powered tooling and runtime enhancements, enabling massive increases in software team throughput.
Discover how Visual Studio’s redesigned interface, faster build times, and integrated AI assistants streamline architecture decisions and accelerate delivery. If you’re building scalable enterprise systems or experimenting with AI-driven workflows, join Clear Measure Chief Architect Jeffrey Palermo in this session to equip you with the insights to architect smarter, faster, and more maintainable solutions.