There is no doubt that TypeScript has seen huge adoption in the JavaScript community, and one of the great benefits it provides is type checking for all the variables in our code. It will check whether performing a given operation on a variable is possible given its type.
Most people think that by using TypeScript as their application language, they are "covered" from any emptiness error, like the classic "undefined is not a function" or, my favorite, "can't read property X of undefined." This assumption is wrong, and the best way to demonstrate it is with code!
I gave a talk on this topic at the TypeScript Berlin Meetup. This article and the talk cover the same content, so you can use either to learn about this topic!
The following example does not present any TypeScript error.
```typescript
// TypeScript definition
type ExampleType = {
  name: string;
  age?: number;
  pets: {
    name: string;
    legs: number;
  }[];
};

// communicates with external API
const fetchData = (): Promise<ExampleType> => {};

const getBiped = async () => {
  const data = await fetchData();
  console.log(data); // { name: 'John' }

  return data.pets.find(pet => pet.legs === 2); // Boom!
};
```
The snippet contains:
- `ExampleType`: a type definition with two required properties, `name` and `pets`, and one optional property, `age`. The property `pets` is an array of objects with `name` and `legs`, both required
- `fetchData`: a function to retrieve data from an external endpoint
- `getBiped`: another function that calls `fetchData`, iterates over the `pets`, and returns only the `pets` with two `legs`
So, why will my script fail when I execute it? The reason is that the external API returns an object that doesn't contain `pets`, so when you try to execute `data.pets.find()`, you will receive the error `Uncaught TypeError: Cannot read property 'find' of undefined`.
In the official React documentation, you can find a very nice definition of what TypeScript is:
TypeScript is a programming language developed by Microsoft. It is a typed superset of JavaScript and includes its compiler. Being a typed language, TypeScript can catch errors and bugs at build time, long before your app goes live.
Given that definition, it's possible to formulate a new assumption:
TypeScript performs static type validation. Developers should take advantage of dynamic validations.
So, should we validate every variable in our application at runtime? Simply put, no!
Checking all the variables in our application is time-consuming from both a development and performance perspective. A nice rule of thumb to follow is:
Validate all the external sources of your application.
External sources are everything that is external to your application or outside of its control. Some examples are responses from external APIs, the contents of files, and user input.
An application will always present at least one external source; otherwise, it would very likely be useless. Therefore, let's take a look at how you can write validations for your objects in TypeScript.
To keep things simple, the original snippet will serve as the base, and on top of it, I will show how to implement each of the validation methods.
The most basic validation is a set of handwritten conditions that check whether the structure is the expected one.
```typescript
const validate = (data: ExampleType) => {
  if (!data.pets) return false;
  // perform more checks
  return true;
};

const getBiped = async () => {
  const data = await fetchData();
  console.log(data); // { name: 'John' }

  if (!validate(data))
    throw Error('Validation error: data is not complete ...');

  return data.pets.find(pet => pet.legs === 2);
};
```
As you can see, a new function has been defined, called `validate`, which receives an `ExampleType` object as a parameter and checks whether the property `pets` is defined. If it is not, it returns `false`, which ends up throwing an error with a descriptive message. Otherwise, execution continues, and now, when evaluating `data.pets.find`, it won't throw an error.
Be aware that the implementation of the `validate` function is quite simple, and there is room for many more checks, such as those listed below (a sketch of such a validator follows the list):
- `name` should exist
- `name` should be a string
- If `age` exists, it should be a number
- `pets` should be an array of objects
- Each `pet` object should have the props `name` and `legs`
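As a rough illustration only, a manual validator covering those extra checks might look like the following sketch; the exact rules are assumptions for the example, not requirements from the original snippet.

```typescript
// Illustrative sketch: a more thorough manual validator for ExampleType
const validate = (data: ExampleType): boolean => {
  // name should exist and be a string
  if (typeof data.name !== 'string') return false;

  // if age exists, it should be a number
  if (data.age !== undefined && typeof data.age !== 'number') return false;

  // pets should be an array of objects...
  if (!Array.isArray(data.pets)) return false;

  // ...and each pet object should have the props name and legs
  return data.pets.every(
    pet => typeof pet.name === 'string' && typeof pet.legs === 'number',
  );
};
```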
The more checks you add, the more robust your application will be, but the more time you need to invest, too.
The main advantage of this method is its flexibility: since the checks are plain code, you can express rules that go beyond types, such as `propertyA` shouldn't exist if `propertyB` is present.

It also presents some disadvantages.
`ExampleType` already defines that there is a `pets` property and that it is required. But again, inside the validation code, you should still check that it's true, so the knowledge ends up duplicated.

Why reinvent the wheel, right? This method consists of using a validation library to assert the structure of your objects; some of the most widely used options are joi, yup, and ajv.
The validation library used for this article is ajv; nevertheless, all the conclusions also apply to the other libraries.
```typescript
const Ajv = require('ajv');
const ajv = new Ajv();

const validate = ajv.compile({
  type: 'object',
  properties: {
    name: { type: 'string', minLength: 3 },
    age: { type: 'number' },
    pets: {
      type: 'array',
      // each pet is an object with its own properties
      items: {
        type: 'object',
        properties: {
          name: { type: 'string', minLength: 3 },
          legs: { type: 'number' },
        },
      },
    },
  },
  required: ['name', 'pets'],
});

const getBiped = async () => {
  const data = await fetchData();
  console.log(data); // { name: 'John' }

  if (!validate(data)) {
    throw Error('Validation failed: ' + ajv.errorsText(validate.errors));
    // Error: Validation failed: data should have required property 'pets'
  }

  return data.pets.find(pet => pet.legs === 2);
};
```
Many validation libraries force you to define a `schema` in which you describe the structure to evaluate. Given that schema, you can create the validation function that will be used in your code.
The declaration of your schema will always depend on the library you are using; therefore, I always recommend checking the official docs. In the case of ajv, it forces you to declare the schema in an object style, where each property has to provide its `type`. It's also possible to set custom checkers for these values, like `minLength` for any `array` or `string`.
One thing this method provides is standardization: a benefit of having a `schema` is to have only one way to check for specific conditions inside your application, especially in JavaScript, where there are many ways to accomplish the same task, such as checking the `length` of an array. This quality is great for improving communication and collaboration inside a team.

This new way of creating validations also presents some drawbacks.
The `schema` and `ExampleType` are disconnected, which means that every time you make a change inside `ExampleType`, you have to manually reflect it inside the `schema`. Depending on how many validators you have, this task can be quite tedious.

One small comment regarding keeping validators and types in sync: some open-source projects address this issue, such as json-schema-to-typescript, which can generate a type definition from an existing schema. Then this won't be considered a problem.
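As a sketch of how that could work, the snippet below derives a type from a schema with json-schema-to-typescript; the schema contents and file paths here are assumptions for the example, not part of any project mentioned above.

```typescript
// generate-type.ts - sketch: derive a TypeScript definition from a JSON Schema,
// so the type and the schema cannot drift apart.
import { compile } from 'json-schema-to-typescript';
import { writeFileSync } from 'fs';

compile(
  {
    type: 'object',
    properties: {
      name: { type: 'string' },
      age: { type: 'number' },
    },
    required: ['name'],
    additionalProperties: false,
  },
  'ExampleType',
).then(ts => {
  // produces roughly: export interface ExampleType { name: string; age?: number; }
  writeFileSync('src/types/ExampleType.d.ts', ts);
});
```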
This is the method I want to talk about, and it represents a paradigm shift in how to create validators and keep types in sync.
In the two other methods, the validator and the type can be seen as different entities: the validator will take the incoming object and check its properties, and the type statically belongs to the object. Combining both entities, the result is a validated type object.
Dynamic type validation allows a type to generate a validator from its definition. Now the two are related: a validator depends entirely on a type, preventing any mismatch between structures.
To generate these validators, I found an amazing open-source project called typescript-json-validator, made by @ForbesLindesay. The repo description states that its goal is to "automatically generate a validator using JSON Schema and AJV for any TypeScript type."
For the test, let's reuse the `ExampleType` definition, which has now been moved to a separate file inside the `types` folder.
```typescript
// src/types/ExampleType.ts
type ExampleType = {
  name: string;
  age?: number;
  pets: {
    name: string;
    legs: number;
  }[];
};

// the generated validator imports this type as the default export
export default ExampleType;
```
This library exposes a handy CLI that can be called from anywhere; given a file path and the name of the type, it will generate, in the same location as the file, a new file with the validator code.
```bash
npx typescript-json-validator src/types/ExampleType.ts ExampleType
# ExampleType.validator.ts created!
```
The resulting validator can be a very long file, so let's take a look at it piece by piece.
First comes the creation of the `ajv` instance. It also sets some default configuration for `ajv`.
```typescript
/* tslint:disable */
// generated by typescript-json-validator
import { inspect } from 'util';
import Ajv = require('ajv');
import ExampleType from './ExampleType';

export const ajv = new Ajv({
  allErrors: true,
  coerceTypes: false,
  format: 'fast',
  nullable: true,
  unicode: true,
  uniqueItems: true,
  useDefaults: true,
});

ajv.addMetaSchema(require('ajv/lib/refs/json-schema-draft-06.json'));

export { ExampleType };
```
Next comes the generation of the `schema` from the type. This is the key to this approach.
```typescript
// Definition of Schema
export const ExampleTypeSchema = {
  $schema: 'http://json-schema.org/draft-07/schema#',
  defaultProperties: [],
  properties: {
    age: {
      type: 'number',
    },
    name: {
      type: 'string',
    },
    pets: {
      items: {
        defaultProperties: [],
        properties: {
          legs: {
            type: 'number',
          },
          name: {
            type: 'string',
          },
        },
        required: ['legs', 'name'],
        type: 'object',
      },
      type: 'array',
    },
  },
  required: ['name', 'pets'],
  type: 'object',
};
```
Finally, the validator is compiled from the `schema`. It also takes care of throwing an exception in case there is an error.
```typescript
export type ValidateFunction<T> = ((data: unknown) => data is T) &
  Pick<Ajv.ValidateFunction, 'errors'>;

export const isExampleType = ajv.compile(
  ExampleTypeSchema,
) as ValidateFunction<ExampleType>;

export default function validate(value: unknown): ExampleType {
  if (isExampleType(value)) {
    return value;
  } else {
    throw new Error(
      ajv.errorsText(
        isExampleType.errors!.filter((e: any) => e.keyword !== 'if'),
        { dataVar: 'ExampleType' },
      ) +
        '\n\n' +
        inspect(value),
    );
  }
}
```
To use the validator, you just need to import it from the respective path and call it. Be aware that this function already checks whether there are any errors inside the object; therefore, it's not necessary to add an `if` statement here, making the code much cleaner.
```typescript
import validate from 'src/types/ExampleType.validator';

const getBiped = async () => {
  const data = validate(await fetchData());
  return data.pets.find(pet => pet.legs === 2);
};
```
This library uses `ajv` under the hood to create the validator function, which means you can make use of all the nice features `ajv` provides, such as custom validation for types.
Let's create a new type definition for `ExampleType`.
```typescript
interface ExampleType {
  /**
   * @format email
   */
  email?: string;

  /**
   * @minimum 0
   * @maximum 100
   */
  answer: number;
}
```
Above each property, you'll find some annotations inside comment brackets. These will be translated into `ajv` rules when the library generates the final schema. This is the result:
```typescript
export const ExampleTypeSchema = {
  $schema: 'http://json-schema.org/draft-07/schema#',
  defaultProperties: [],
  properties: {
    answer: {
      maximum: 100,
      minimum: 0,
      type: 'number',
    },
    email: {
      format: 'email',
      type: 'string',
    },
  },
  required: ['answer'],
  type: 'object',
};
```
The property `answer` now presents two more attributes that will check whether the `number` is between 0 and 100. In the case of `email`, it will check whether the `string` value is a valid email address.
As these annotations are wrapped inside comments, they don't present any conflict with the TypeScript compiler.
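Assuming the annotated type above has been run through the CLI again, a quick way to see these extra rules in action might look like this sketch (the import path is reused from the earlier example):

```typescript
import validate from 'src/types/ExampleType.validator';

// Passes: answer is within [0, 100] and email has a valid format;
// the return value is now typed as ExampleType
const checked = validate({ answer: 42, email: 'jane@example.com' });

// Throws: answer is above the maximum and email is not a valid address
try {
  validate({ answer: 150, email: 'not-an-email' });
} catch (error) {
  console.error(error); // ajv's error text plus the inspected value
}
```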
This method is based on the idea that the developer will run the CLI command and generate the validators; otherwise, there's a possibility that the schema was generated with an older version of the type, which can then present mismatches.
Fixing this issue is quite easy: you simply have to add a script that will be executed before your code runs. You can call it `prebuild` or `prestart`, and this is how your `package.json` might look:
{ "scripts": { "prebuild": "typescript-json-validator src/types/ExampleType.ts ExampleType", "start": "yarn prebuild && ts-node start.ts", "build": "yarn prebuild && tsc" } }
One last recommendation: ignore any `.validator.ts` file in your version control. There is no point in committing these files to your repository, since they are going to be generated every time you start your project.
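For example, a single pattern in your `.gitignore` is enough (assuming the generated validators sit next to their type files):

```
# .gitignore - generated validators are rebuilt on every start/build
*.validator.ts
```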
About two months ago, I open-sourced one of my side projects, called gatsby-starter-linkedin-resume.
In summary, it's a Gatsby starter that can retrieve your information from LinkedIn, using a LinkedIn crawler, and generate an HTML and PDF resume from it using JSON Resume.
The project presents two main flows:

- Retrieving your information from LinkedIn with the crawler and transforming it into the JSON Resume format
- Building the Gatsby site that renders the HTML and PDF resume from that data
At the beginning of this article, I mentioned that it's advisable to validate your external sources. For this project, they are:

- The data returned by the LinkedIn crawler
- The `resume.json` file consumed by the Gatsby configuration
These are the type definitions for each case:
```typescript
interface LinkedInSchema {
  contact: ContactItem[];
  profile: ProfileData;
  positions: LinkedInPosition[];
  educations: LinkedInEducation[];
  skills: Skill[];
  courses: Course[];
  languages: LinkedInLanguage[];
  projects: LinkedInProject[];
}

interface JsonResumeSchema {
  basics: JsonResumeBasics;
  work: JsonResumeWork[];
  volunteer?: JsonResumeVolunteer[];
  education: JsonResumeEducation[];
  awards?: JsonResumeAward[];
  publications?: JsonResumePublication[];
  skills?: JsonResumeSkill[];
  languages?: JsonResumeLanguage[];
  interests?: JsonResumeInterest[];
  references?: JsonResumeReference[];
  projects?: JsonResumeProject[];
}
```
Both types present similarities in terms of variable names, but their internal structure differs. This is why it's necessary to transform from one structure to the other in the first flow.
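To give an idea of that transformation, here is a heavily simplified sketch; the field names below (`headline`, `label`, and so on) are made up for illustration and do not necessarily match the real `ProfileData` or `JsonResumeBasics` shapes in the project.

```typescript
// Hypothetical, trimmed-down shapes purely for illustration
type MiniLinkedIn = {
  profile: { name: string; headline: string };
  skills: { name: string }[];
};

type MiniJsonResume = {
  basics: { name: string; label: string };
  skills: { name: string }[];
};

// Sketch of the LinkedIn-to-JSON-Resume transformation used in the first flow
const mapLinkedInToJsonResumeSketch = (data: MiniLinkedIn): MiniJsonResume => ({
  basics: {
    name: data.profile.name,
    label: data.profile.headline,
  },
  skills: data.skills.map(skill => ({ name: skill.name })),
});
```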
After I set up my project to generate the validators from these types, checking the structure of the incoming object was a very easy task.
```typescript
// src/index.ts
import { RESUME_PATH, LINKED_IN_PATH } from './utils/path';
// import of the validator
import validateLinkedInSchema from './types/LinkedInSchema.validator';
import { saveJson, readJson } from './utils/file';
import { inquireLoginData, getLinkedInData } from './utils/linkedin';
import mapLinkedInToJSONResume from './utils/mapLinkedInToJSONResume';

export const main = async ({ renew }) => {
  if (renew || !readJson(LINKED_IN_PATH)) {
    const credentials = await inquireLoginData();
    const linkedInData = await getLinkedInData(credentials);
    saveJson(LINKED_IN_PATH, linkedInData);
  }

  // validation in action
  const linkedInParsed = validateLinkedInSchema(readJson(LINKED_IN_PATH));
  const jsonResumeData = mapLinkedInToJSONResume(linkedInParsed);
  saveJson(RESUME_PATH, jsonResumeData);
};
```
```javascript
// gatsby-config.js
const { existsSync } = require('fs');
// import of the validator
const {
  default: validateJsonResume,
} = require('./lib/types/JsonResumeSchema.validator');

if (!existsSync('./resume.json')) {
  throw new Error(
    'Please run "yarn generate-resume" to generate your resume information.',
  );
}

// validation in action
const resumeJson = validateJsonResume(require('./resume.json'));

module.exports = {
  plugins: [
    {
      resolve: 'gatsby-theme-jsonresume',
      options: {
        resumeJson,
      },
    },
    'gatsby-plugin-meta-redirect',
  ],
};
```
To sum it all up, I created this table comparing the three methods. The dynamic types approach combines the best of the other methods, making it the recommended approach for validating your objects.
| Approach | No additional syntax | Validators and types in sync | Standardization |
| --- | --- | --- | --- |
| Manual | ✅ | ❌ | ❌ |
| Library | ❌ | ❌ | ✅ |
| Dynamic types | ✅ | ✅ | ✅ |
If you are working in a TypeScript codebase, I recommend you give this new method of validating your objects a try. It's very easy to set up, and in the event you don't find it useful, removing it from the codebase is as easy as removing an import from your files.