Usually there is no special need to be able to develop the same code from multiple computers. Mostly it is easy - we have version control for this: git push on one machine, git pull on another. All good. Enough for web and backend development.
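The whole sync boils down to two commands (branch name here is just an example):

```bash
# On machine A: commit and push your work
git add -A && git commit -m "wip" && git push origin main

# On machine B: pull the same branch and continue
git pull origin main
```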
But there is a small number of cases when a project is a bit too complex for this: it involves installing system packages, config files, ssh keys, datasets, and setting env variables that differ between branches. It is still possible to sync both machines by hand, but now you will probably spend an evening on it.
This happened to me with machine learning/data science projects. Such projects typically require a lot of experiments to find the best ML model, and all of these experiments need to be reproducible. So the configs and settings for each experiment should be saved in a separate branch.
I started developing directly inside docker images. They can be committed, pushed to a private registry, and pulled back on another machine. Having everything in docker allowed me to isolate environments and try something new in a separate environment without creating a mess in a single shared one. And then I was able to move the entire environment from a laptop to a powerful PC.
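A rough sketch of that workflow looks like this (the container name, registry address, and tag are placeholders for illustration):

```bash
# Snapshot the running dev container (its packages, configs, keys, datasets)
docker commit my-dev-container registry.example.com/me/dev-env:exp-42

# Push the snapshot to a private registry
docker push registry.example.com/me/dev-env:exp-42

# On the other machine: pull the snapshot and continue where you left off
docker pull registry.example.com/me/dev-env:exp-42
docker run -it registry.example.com/me/dev-env:exp-42 bash
```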
I made a small solution by putting a browser-based version of VS Code, a terminal, a scheduler, and a file browser into a single docker image. This way I got an isolated, movable, and shareable environment. In this article I describe how to move such environments between computers.
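As a minimal sketch, running such an image could look like the following; the image name, the mounted path, and the assumption that the browser-based IDE listens on port 8080 are all hypothetical:

```bash
# Start the all-in-one environment in the background;
# port 8080 is assumed to be where the browser-based IDE listens
docker run -d \
  --name dev-env \
  -p 8080:8080 \
  -v "$PWD/projects:/home/dev/projects" \
  registry.example.com/me/dev-env:latest

# Then open http://localhost:8080 in a browser to get the IDE,
# terminal, and file browser
```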