Right now, before a bridge, a building, or an aircraft is built, it goes through a rigorous series of computer tests that simulate stress in a thousand different ways. This simulation technology has not only made engineering vastly safer in all aspects of life; it has also sped up the process from pre-production to completion. So why not start applying this same technology to the human body?
A full digital analogue of a human would have literally tens of thousands of applications; most notably, it would speed up medical testing, treatment, and pharmaceutical research. That means more drugs, much safer, in less time. A digital human could test a new drug's effectiveness, as well as its long-term effects, all within a matter of minutes. The digital human could be given random assortments of drugs, millions every second, while the computer reads their effects; new drugs could then be developed not by careful research but by billions of simulations and trial and error (all of which happens, of course, in a small amount of time) until a desired outcome is reached. It's a somewhat inelegant solution to health problems, but it might prove effective.
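The trial-and-error loop described here is, in effect, a random search over candidate treatments. A minimal sketch, where `simulate_response` is a hypothetical stand-in for the digital human's scoring of a candidate dose profile (the function and its numbers are illustrative, not an actual biological model):

```python
import random

def simulate_response(dose_profile):
    """Hypothetical stand-in for a digital-human simulation.

    Scores a candidate treatment; here it simply rewards dose
    profiles near an arbitrary optimum, purely for illustration.
    """
    target = [0.4, 0.7, 0.2]
    return -sum((d - t) ** 2 for d, t in zip(dose_profile, target))

def random_search(n_trials=100_000, seed=0):
    """Try random candidates, keeping the best-scoring one."""
    rng = random.Random(seed)
    best_profile, best_score = None, float("-inf")
    for _ in range(n_trials):
        candidate = [rng.random() for _ in range(3)]  # random doses
        score = simulate_response(candidate)
        if score > best_score:
            best_profile, best_score = candidate, score
    return best_profile, best_score

best, score = random_search()
```

The search is blind, exactly as the essay suggests: no understanding of the mechanism is needed, only a fast simulator and a way to score outcomes.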
This is not all it could be used for. Medical students could get advanced training without having to practice on real patients; the safety of new vehicles could be tested with far greater accuracy; forensic science (which is now investing in animation) could improve greatly; even video games would benefit. The applications are endless.
But this technology has major roadblocks ahead of it, most notably the lack of knowledge.* Complicated computer simulations are only as good as one's knowledge of what is being simulated. For years now, scientists and programmers have attempted to create a digital analogue of the climate, but none has shown real success in predicting actual weather; the variables are too many and our understanding too small. Quite simply, we don't yet understand enough to create a digital analogue of the human body.
But we don’t need to start by creating a full digital human body; a better approach might be to build up to it. Start by simulating something in a limited environment, for instance algae cultures, and move on from there.
Or you could start more broadly: computer systems are far better at capturing large trends than small interactions, so starting big and working down to the small might be the best way to go. A working digital liver could be created that replicates (imperfectly) the functions of a real liver; as more knowledge is gained, it could be expanded and made more complex until it becomes a reasonable analogue.
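As a sense of what a first, deliberately coarse liver model might look like: standard one-compartment pharmacokinetics treats drug elimination as simple exponential decay. A sketch (the function name and the example numbers are illustrative):

```python
import math

def drug_concentration(c0, half_life_h, t_h):
    """Coarse 'digital liver': first-order drug elimination.

    Models plasma concentration as exponential decay, the
    textbook one-compartment pharmacokinetic assumption.
    Refinements (enzyme saturation, multiple compartments)
    could be layered on as knowledge grows.
    """
    k = math.log(2) / half_life_h  # elimination rate constant
    return c0 * math.exp(-k * t_h)

# A drug with a 4-hour half-life, starting at 100 units:
drug_concentration(100.0, 4.0, 8.0)  # two half-lives, ~25.0 left
```

The point is not that this model is accurate; it is that a crude, working version gives researchers something concrete to measure against and refine.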
It would take a great deal of trial and error, and some serious research, to bring this technology to fruition. Once under development, it would need to improve with time (it could in fact be developed in conjunction with research into how the body works). But take comfort that this technology is very plausible; in fact, it’s likely. It’s just a matter of when.
I have looked online, and there are few people talking about this technology and even fewer working on it. I managed to find this paper, which is an outline of why it would be beneficial, though it is unsourced and not very helpful.
Whoever develops this stands to benefit greatly, not just financially, but by living in a world where medical and engineering technology are accelerated. I hope, for the sake of vastly better technologies, that some smart people begin to work seriously on this massive undertaking in the foreseeable future.
*Even if you solve the major problem of limited knowledge, there are still many problems to overcome. Processing power is too limited right now; even large computers would have a hard time computing the reactions within a single human cell (millions upon millions of replicating DNA building blocks interacting, for instance). In a human body with over a trillion cells, all affecting each other, that is an indescribable amount of information. Compute power is increasing rapidly, though, so this may not remain a problem for long.
Communication between the people developing the digital human and the researchers and doctors might be a problem as well.
It will also be unclear, in the early stages of development, whether one has truly created an analogue, or merely a system whose inputs and outputs happen to match the real thing in certain ways.
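This distinction can be made concrete: a simulation can match the real system on every input you happen to test and still fail elsewhere. A minimal sketch (the `behaves_like` helper and the Taylor-series stand-in are illustrative):

```python
import math

def behaves_like(model, reference, test_inputs, tol=1e-4):
    """Check input/output agreement on a sample of inputs.

    Passing this check is necessary but not sufficient evidence
    that the model is a true analogue of the real system.
    """
    return all(abs(model(x) - reference(x)) <= tol for x in test_inputs)

# A truncated Taylor series mimics sin(x) on small inputs...
model = lambda x: x - x**3 / 6
behaves_like(model, math.sin, [0.0, 0.1, 0.2])  # agrees here
# ...yet diverges badly outside the sampled range:
behaves_like(model, math.sin, [2.0])            # disagrees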
But all these problems pale in comparison to the first and biggest one.