Groups of researchers are working on a big fix for the Internet. But some want to replace it entirely. If that sounds far-fetched, think again. It could reshape the way the healthcare industry conducts its e-business.
According to Dr. Darleen Fisher, Program Director in the National Science Foundation’s (NSF) Directorate for Computer and Information Science and Engineering, Division of Computer and Network Systems, Network Systems Cluster, the Internet’s sheer size, and the complexity that comes with it, are overwhelming its ability to effectively handle the tasks with which it is currently charged. “But despite its robustness and ability to adapt to ever-increasing demands, it was never designed with the responsibility of supporting our critical infrastructures in mind,” comments Fisher. Anyone who has been negatively affected by spam or viruses, or has simply worried about the security of private data when paying a bill online, can see that inefficiencies plague the Web even at its most basic level, hindering the ability of some industries to make full use of the Internet. This is especially true of healthcare because of issues surrounding the private data of patients.
For these reasons, Internet researchers, members of academia, and the United States government have proposed that the Internet is due for a full-scale makeover, one that will result in something faster, less cluttered, and more secure.
In the Beginning
Many involved in this debate over the Internet’s future believe that the problems attributed to it today stem from the Internet’s decades-old architecture. Sponsored by the Department of Defense’s Advanced Research Projects Agency (ARPA, known today as DARPA), the Internet’s predecessor, ARPANET, was developed in the late 1960s and early 1970s as a research tool to facilitate communication between ARPA-sponsored researchers. ARPANET was built on an open architecture designed to accommodate a group of users numbering only in the thousands. It also lacked safeguards against anonymous, malicious activity such as spamming or phishing. Thus, as communications-centered Internet technologies flourished, so consequently did the Internet’s dark side. More importantly, while risks to private information grew with the volume of users, the reasons people used the Internet began to change.
“The current Internet was designed for delivery of data, but new content-delivery systems are emerging as the dominant applications,” says Fisher. “They’re producing about 80% of Internet traffic, yet they run over a network structure designed with different applications in mind.”
Internet2 and the Clean Slate Design
Efforts to “reclaim” the Internet have been underway since the mid-1990s and generally fall under one of two umbrellas. The first is the research consortium; the best known in the United States is Internet2, and others include GEANT2, SURFnet, Renater, and CANARIE. Research consortia operate on a model similar to that of ARPANET and foster high-speed, secure communication among members of academia, government agencies, and corporations. The second camp believes the Internet has fundamental architectural problems and that starting over from a clean slate is the best way forward. Its members include Dr. Dipankar Raychaudhuri, a professor in the Department of Electrical and Computer Engineering at Rutgers University and the Director of the University’s Wireless Information Network Laboratory (WINLAB).
“I believe that research on clean slate protocols is important because fundamental innovation cannot be done with the constraint of backward compatibility,” says Raychaudhuri in conversation with MDNG. “The Internet’s design has remained relatively static for approximately 20 years, and it is reasonable to consider a complete redesign.” To that end, the NSF instituted the Future Internet Design (FIND) program as part of its Network Technology and Systems (NeTS) initiative to “envision what a desirable network should be and how it should be designed.” Going even further is another NSF-led initiative called the Global Environment for Network Innovations (GENI). Whereas FIND is only one aspect of NeTS, GENI was created specifically to “give researchers the opportunity to experiment unfettered by assumptions or requirements and to support those experiments at a large scale with real user populations.”
Healthcare: How Do I Benefit?
Proponents of both the Internet2-style and clean slate approaches to Internet reform may differ in their means, but are certainly in concert on the ends—“Fix the Internet!”
So then, if there is a New Internet out there waiting to be developed, what implications would it have for the healthcare industry?

Security: A new Internet would most likely include basic changes to the Internet protocol, which, according to Raychaudhuri, many network architects feel will eliminate the security and privacy problems associated with today’s Internet protocol. These changes could revolutionize electronic communication between patient and physician. With a reduced security risk, insurance companies might even begin to pay for such communication under their plans.

Mobility: TCP/IP [the basic communication language of the Internet] doesn’t work well with mobile devices connected by wireless channels. For example, a disconnection results in a complete restart of the transmission control protocol (TCP) and an application-level timeout. “A new Internet protocol design could build in features that deal with occasional disconnection and end-user mobility without having to restart all our applications as we move around,” Raychaudhuri says. This would allow for better connections and more robust mobile applications. Raychaudhuri notes that he is starting to see an increasing number of trials and experiments involving sensors and wireless networking for emergency room triage and patient monitoring in hospitals.
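The cost Raychaudhuri describes can be made concrete with a small sketch. Today, when a wireless link drops, the application typically has to tear down the connection and start the whole transfer again from the beginning. The sketch below is a hypothetical application-level retry wrapper (the `fetch` callable and its behavior are illustrative assumptions, not part of any real protocol stack); its key limitation is that every retry restarts from byte zero.

```python
import time

def fetch_with_reconnect(fetch, max_retries=3, base_delay=0.01):
    """Retry a transfer from scratch after each disconnection.

    `fetch` is a hypothetical callable standing in for one complete
    transfer attempt; it raises ConnectionError if the link drops
    mid-transfer. Because TCP ties a connection to fixed endpoints,
    the application cannot resume -- each attempt begins again at
    byte zero, which is the waste a mobility-aware protocol would
    avoid.
    """
    delay = base_delay
    for attempt in range(1, max_retries + 1):
        try:
            return fetch()          # restarts the transfer from the top
        except ConnectionError:
            if attempt == max_retries:
                raise
            time.sleep(delay)       # back off before trying again
            delay *= 2              # exponential backoff
```

A protocol with built-in mobility support would instead let the endpoint move or reconnect while the transfer state survives, so no such restart loop is needed.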
Remote Medicine: A new Internet would increase the quality and efficiency of current remote medicine efforts, creating a better standard of care while also expanding the range of possibilities to which remote medicine can be applied. A new Internet, by virtue of high-speed networks, would also ease concerns about bandwidth and latency. The Internet2 Network has a capacity of 100 gigabits per second, more than enough to handle high-quality video and medical imagery.
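The bandwidth claim is easy to sanity-check with back-of-envelope arithmetic. The sketch below ignores latency and protocol overhead, and the 2 GB imaging-study size is an illustrative assumption, not a figure from the article.

```python
def transfer_seconds(size_bytes, link_bits_per_sec):
    """Idealized (latency- and overhead-free) time to move a file."""
    return size_bytes * 8 / link_bits_per_sec

# A hypothetical 2 GB imaging study:
study = 2 * 10**9                                  # bytes
t_backbone = transfer_seconds(study, 100 * 10**9)  # 100 Gbps, Internet2-class
t_broadband = transfer_seconds(study, 100 * 10**6) # 100 Mbps, typical broadband
```

On these assumptions the study moves in a fraction of a second over a 100 Gbps link, versus a couple of minutes over ordinary broadband, which is the gap that makes real-time remote imaging plausible on a research backbone.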
A new Internet would also be smarter, allowing data to be communicated via the multicast model. Unicasting, the dominant data-transfer model of today’s Internet, uses bandwidth unnecessarily by sending a separate copy of the entire data stream to each individual user on a network. Multicasting transmits a single copy, which the network duplicates only where necessary. Dr. Fisher points out that 80% of the Internet’s traffic is generated by applications that the Internet wasn’t originally designed to handle. These functions—heavy data transfer, robust security, content management—would be inherent to the new Internet’s architecture and, in turn, would not require add-ons or modifications to work properly. A new Internet would simplify the implementation of certain technologies in the medical field, make their use more efficient, and expand the options available to physicians.
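The unicast-versus-multicast difference comes down to who pays for the copies. A minimal toy model, looking only at sender-side bytes (real multicast cost also depends on where network paths branch, which this deliberately ignores):

```python
def unicast_bytes(payload_bytes, receivers):
    """Unicast: the sender transmits one full copy per receiver."""
    return payload_bytes * receivers

def multicast_bytes(payload_bytes, receivers):
    """Multicast: the sender transmits a single copy; routers
    duplicate it downstream only where paths to receivers diverge,
    so sender-side cost is independent of audience size."""
    return payload_bytes
```

For a 1 MB update pushed to 50 clinics, unicast costs the sender 50 MB of uplink while multicast costs 1 MB, which is why streaming-style applications strain a unicast-only design.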
The Look of the Future
While the debate rages over whether the Internet should be created anew or the current version merely modified, Raychaudhuri and Fisher believe that the process will most likely not be so black and white. Raychaudhuri indicates that emerging protocols could simply be layered on top of today’s protocols, which would then gradually disappear. “This is no different from what happened to telephone circuits initially used to carry data,” he says. “Clever engineers will come up with evolutionary deployments of clean-slate ideas without having to throw away most of the infrastructure equipment.” “The NSF expects that, at the very least, the results of clean-slate research will be to train a new generation of network researchers to think about designing and deploying big systems, with new capabilities and applications,” says Fisher. And although some of those ideas may not be applicable within the parameters of today’s Internet, “we expect that the results of promising research from the FIND projects will impact the current Internet in much the same way as did work a decade or more ago.”
Bradley Schmidt is a freelance author and former MDNG editor.