Microsoft is porting the TypeScript compiler to Go, resulting in a 10x speed boost. This article explains why Go was chosen over Rust and C#, why the compiler was ported instead of rewritten, and what this means for developers — including faster builds, improved CI/CD performance, and better editor responsiveness.
If you are a developer who has been working in the JavaScript/TypeScript ecosystem for a long time, the last couple of weeks have been quite interesting. In what can be considered one of the most pivotal moments of the past decade, Microsoft announced that they are porting the TypeScript compiler to Go. While the port to an entirely new language is big news in and of itself, it was also announced that this move will result in a 10x faster compiler.
In this article, we will look into the significance of this move and what it means for TypeScript developers. Let’s dive in!
When the language was originally designed in 2012, the TypeScript team chose to implement the compiler in TypeScript itself. In other words, the TypeScript code written by developers like us is processed by a compiler that is itself written in TypeScript. This decision was made so that the compiler could be easily maintained and extended by the community.
Another reason was that, in 2012, the language was mainly being used in UI development tasks instead of compute-intensive applications.
However, as the language grew in popularity and complexity, the compiler’s performance became a bottleneck for many developers. In large projects, the TypeScript compiler started taking a significant amount of time to build and compile code. This was especially true for projects with millions of lines of code and complex type systems. For instance, from the data presented in the official blog post, the VS Code repo (with more than 1.5 million lines of code) was taking 77.8 seconds to compile – not an insignificant amount of time!
This prompted the maintainers to look for ways to improve the compiler’s performance. They decided to port the compiler to Go and called the entire effort ‘Project Corsa’. After the port, the same repository was compiled in just 7.5 seconds!
These benefits are not just limited to large repositories like VS Code but are expected to be seen across the board. For another reference, the rxjs repo (with about 2100 lines of code) was taking 1.1 seconds, which was reduced to 0.1 seconds after the port.
This means we are seeing a roughly 10x improvement in compilation times across the whole spectrum of projects, from 1.5-million-line repositories down to small libraries.
One question that might come to your mind is: How was a large codebase like TypeScript built from the ground up in a new language like Go so quickly? Well, it wasn’t.
The elegance of this solution is that the compiler was not re-written from scratch. Instead, all the code in the repository was programmatically translated into its Go equivalent. This means that the various parts that make up the TypeScript compiler — like the scanner, parser, binder, and type checker — were all “lifted and shifted” to Go.
One of the TypeScript maintainers has shared more details about why porting was chosen over a ground-up rewrite. The main reason is compatibility: a faithful port preserves the existing compiler's behavior, so years of accumulated bug fixes, edge-case handling, and test coverage carry over instead of having to be rediscovered in a new implementation.
Another advantage of this approach is that most of the code can be ported over to Go with automated scripts, and only the critical parts need to be rewritten by hand. This shows when we compare the corresponding files from the two codebases. For instance, this is what a helper method called reportCircularityError looks like in the checker.ts file in the TypeScript codebase:

```typescript
function reportCircularityError(symbol: Symbol) {
    const declaration = symbol.valueDeclaration;
    // Check if variable has type annotation that circularly references the variable itself
    if (declaration) {
        if (getEffectiveTypeAnnotationNode(declaration)) {
            error(
                symbol.valueDeclaration,
                Diagnostics._0_is_referenced_directly_or_indirectly_in_its_own_type_annotation,
                symbolToString(symbol),
            );
            return errorType;
        }
        // Check if variable has initializer that circularly references the variable itself
        if (noImplicitAny && (declaration.kind !== SyntaxKind.Parameter || (declaration as HasInitializer).initializer)) {
            error(
                symbol.valueDeclaration,
                Diagnostics._0_implicitly_has_type_any_because_it_does_not_have_a_type_annotation_and_is_referenced_directly_or_indirectly_in_its_own_initializer,
                symbolToString(symbol),
            );
        }
    }
    else if (symbol.flags & SymbolFlags.Alias) {
        const node = getDeclarationOfAliasSymbol(symbol);
        if (node) {
            error(node, Diagnostics.Circular_definition_of_import_alias_0, symbolToString(symbol));
        }
    }
    return anyType;
}
```
This is the equivalent method in the checker.go file in the Go codebase:

```go
func (c *Checker) reportCircularityError(symbol *ast.Symbol) *Type {
	declaration := symbol.ValueDeclaration
	// Check if variable has type annotation that circularly references the variable itself
	if declaration != nil {
		if declaration.Type() != nil {
			c.error(symbol.ValueDeclaration, diagnostics.X_0_is_referenced_directly_or_indirectly_in_its_own_type_annotation, c.symbolToString(symbol))
			return c.errorType
		}
		// Check if variable has initializer that circularly references the variable itself
		if c.noImplicitAny && (!ast.IsParameter(declaration) || declaration.Initializer() != nil) {
			c.error(symbol.ValueDeclaration, diagnostics.X_0_implicitly_has_type_any_because_it_does_not_have_a_type_annotation_and_is_referenced_directly_or_indirectly_in_its_own_initializer, c.symbolToString(symbol))
		}
	} else if symbol.Flags&ast.SymbolFlagsAlias != 0 {
		node := c.getDeclarationOfAliasSymbol(symbol)
		if node != nil {
			c.error(node, diagnostics.Circular_definition_of_import_alias_0, c.symbolToString(symbol))
		}
	}
	return c.anyType
}
```
Notice how each line can be compared and mapped to its equivalent in the Go codebase. This is the power of the porting approach.
When performance was identified as a bottleneck, the TypeScript team started looking into ways to improve the compiler. As Anders Hejlsberg has explained in video interviews, they considered multiple languages: Rust, C# (Microsoft’s own in-house favorite), and Go. Each language came with its own pros and cons.
Rust is a systems programming language known for its performance and safety. It’s growing in popularity among developers for performance-critical applications, so it could have been a great option for this project.
C# is Microsoft’s own language and is used in many of its products. If it had been chosen, it would have allowed the team to leverage the existing knowledge and tools.
The team ultimately chose Go as the language to port the TypeScript compiler to. Go is known for its fast compilation times and low memory usage. It also compiles to native code on every major platform and is garbage collected, which is a good fit because the existing compiler's highly interconnected data structures were written assuming a garbage collector.
However, the more important reason for choosing Go is its semantic similarity to TypeScript and the “portability” that we saw in the previous section. This was a key factor in the decision-making process. The team has discussed its reasoning in more depth in the project's public discussions on GitHub.
Hejlsberg has mentioned that most of the compiler port is complete, while the type checker is about 80 percent done.
Active development is now focused on the language service. While most of the performance gains are attributed to using a native language like Go, the rest of the improvements come from other fine-tuning that the team is doing. One such improvement is leveraging concurrency – for instance, running four instances of the type checker instead of just one. Another boost comes from re-architecting the language service to better align with the Language Server Protocol.
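The multi-checker idea is easiest to see in miniature. Below is a minimal, hypothetical Go sketch of the fan-out pattern: file names are distributed across a fixed pool of worker goroutines, loosely analogous to running several type checker instances side by side. The checkFile helper and its signature are invented for illustration and do not come from the actual compiler.

```go
package main

import (
	"fmt"
	"sync"
)

// checkFile stands in for a per-file type-checking pass.
// Hypothetical: the real checker's API is far richer.
func checkFile(name string) string {
	return "checked " + name
}

// checkAll fans the files out across nWorkers goroutines,
// mirroring the idea of running several checker instances at once.
func checkAll(files []string, nWorkers int) []string {
	jobs := make(chan string)
	results := make(chan string, len(files))
	var wg sync.WaitGroup
	for i := 0; i < nWorkers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for f := range jobs {
				results <- checkFile(f)
			}
		}()
	}
	for _, f := range files {
		jobs <- f
	}
	close(jobs)
	wg.Wait()
	close(results)
	var out []string
	for r := range results {
		out = append(out, r)
	}
	return out
}

func main() {
	files := []string{"a.ts", "b.ts", "c.ts", "d.ts"}
	fmt.Println(len(checkAll(files, 4)))
}
```

The real gains depend on how independently files can be checked; the compiler has to partition work so that the checkers rarely block on one another.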
The Language Server Protocol (LSP) is now widely used by modern language services. However, when TypeScript was originally created, LSP didn’t exist, so its language service grew up with its own conventions. Porting TypeScript to Go gives the team a natural opportunity to re-architect the language service to better align with the LSP.
One of the main benefits of this transition is improved performance, with no extra effort required from developers. Once a developer updates to TypeScript v7, the new Go-based compiler will automatically be used when running tsc.
A faster language service (which VS Code uses for providing IntelliSense, code navigation, etc.) is also expected to be a part of the benefits. This will result in faster editor startup times and better responsiveness.
One of the immediate benefits that developers can expect is faster build times. This is especially true for CI/CD pipelines where the build times can be significantly reduced. This will result in faster feedback loops and quicker deployments.
When a huge TypeScript repo is loaded in a code editor like VS Code, there is a noticeable delay before the editor finishes loading the files, resolving the links between them, and providing IntelliSense. With the new compiler, this is expected to be significantly faster. Type-checking feedback in the editor should speed up too, making the red squiggly lines (a sight every developer detests) appear and disappear sooner.
Another area where developers can expect to see improvements is in the hot reload times. When a developer makes a change in the code and saves it, the time it takes for the changes to reflect in the browser is expected to be faster. This is because the compiler can process the incrementally changed files faster.
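One way to build intuition for that incremental speedup: track a content fingerprint per file and re-check only the files whose fingerprint changed since the last run. The sketch below uses invented names (buildCache, changed) and is a deliberate simplification, not the compiler's actual incremental machinery.

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// buildCache maps a file name to the hash of its last-checked contents.
// Hypothetical sketch: a real incremental build tracks much more state.
type buildCache map[string][32]byte

// changed reports whether the file needs re-checking and records its new hash.
func (c buildCache) changed(name string, contents []byte) bool {
	h := sha256.Sum256(contents)
	if prev, ok := c[name]; ok && prev == h {
		return false // unchanged: skip re-checking
	}
	c[name] = h
	return true
}

func main() {
	cache := buildCache{}
	fmt.Println(cache.changed("main.ts", []byte("let x = 1"))) // first run: true
	fmt.Println(cache.changed("main.ts", []byte("let x = 1"))) // unchanged: false
	fmt.Println(cache.changed("main.ts", []byte("let x = 2"))) // edited: true
}
```

In practice the hard part is not detecting the change but limiting how far its effects ripple, since editing one file can invalidate the types of every file that imports it.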
The TypeScript v5.9 release is expected soon, with the codebase continuing to be written in TypeScript through the v6.x versions. During this phase, some features will be deprecated, and some breaking changes will be introduced to prepare for the transition to a native compiler. The fully native Go-based compiler is anticipated to be released with TypeScript v7, following the completion of the v6.x series.
What Hejlsberg and the team have accomplished here is the TypeScript equivalent of running a four-minute mile. It’s a significant milestone in the history of TypeScript. While the performance benefits are sure to trickle down to the developers and the ecosystem, this will also inspire other libraries in the TypeScript ecosystem to push the limits of what is possible.
I don’t know about you, but this sure makes me excited about the future of TypeScript. What a time to be alive!