Data centers are the physical internet

From CFACT

By David Wojick

Building new data centers has become highly controversial. This massive public debate suffers from a major confusion in that what data centers do is seldom mentioned, except that they do AI, which is also controversial.

The vague name “data centers” itself contributes to the confusion. It sounds like they just store a lot of data, and what good is that?

In reality, they should be called internet centers, because much of the internet’s essential processing occurs there. Internet processing is the primary function of data centers. You cannot logically love the internet and hate data centers as they are the same thing. A data center is a chunk of the internet in a box.

The big problem is that most people have no idea how the internet works, so here is a highly simplified look at it.

The core process is called packet switching, and it is amazing. It was originally developed as a “bomb proof” defense communication technology because message traffic takes no fixed route that can be knocked out.

To take a simple case, let’s say you email someone an Acrobat document. You might think it just travels to them like physical mail, but it is very different. First, the email and the document are broken down into small pieces called packets of data. Keep in mind that to a computer everything is just a structured bunch of ones and zeros, so it can readily be taken apart and put back together.

Then come specialized computers called routers, which data centers are packed with. The sending computer puts out a call for available routers, which respond from many data centers. It picks one and sends it a packet. This process continues until all the packets have been sent to routers. These routers repeat the forwarding process until all the packets reach the final destination. There the destination computer reassembles the email and the document so it can be viewed.
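
The split-and-reassemble idea described above can be sketched in a few lines of Python. This is a toy model, not real TCP/IP; the packet size and message are made up for illustration.

```python
import random

PACKET_SIZE = 8  # bytes per packet; real networks use roughly 1500-byte packets

def packetize(message):
    """Break a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + PACKET_SIZE])
            for i in range(0, len(message), PACKET_SIZE)]

def reassemble(packets):
    """Rebuild the original message, whatever order the packets arrived in."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Hello! This PDF attachment travels as packets."
packets = packetize(message)
random.shuffle(packets)  # packets may take different routes and arrive out of order
assert reassemble(packets) == message
print("reassembled correctly from", len(packets), "packets")
```

The key point is the sequence number carried by every packet: the receiving computer does not care which route each packet took or in what order they arrived.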

The immense value of this complex technology is that it does not require dedicated lines from sender to receiver. This routing process is also true for looking at and communicating with World Wide Web stuff that is running on a server computer somewhere. The web page you are looking at is sitting as ones and zeros on a server somewhere, sending you a picture version of itself.

Data centers are where most of the internet’s routers, switches, and servers are housed. The internet comes in big boxes. So, while you might not want a big box of internet in your neighborhood, you cannot reasonably use the internet and say data centers are a bad thing.

As for AI, it is not the first big technology to make the internet grow rapidly over the last fifty years. Personal computers and smartphones are two prior examples.

AI is extremely useful for certain internet tasks, such as searching and analyzing large numbers of documents. It saves me a lot of search time that I can then spend doing more research. AI can greatly improve the productivity of certain tedious tasks. Its growth is certain.

Mind you, the tremendous rapid growth in data centers being touted and causing great anxiety is likely a bubble of hyperbole. Like the dot-com investment bubble that burst in 2000, the projected growth is impossible. Growth will be sharply constrained by available electricity generating capacity.

But in the meantime, the public debate will no doubt continue. It would help if people understood that data centers are primarily the home of the physical internet, which needs to keep growing.

Calling data centers “internet centers” might help the policy debate.

January 27, 2026 2:15 am

Yes, clearly made point very worth making.

ResourceGuy
January 27, 2026 2:28 am

Rapid growth of huge behind-the-meter power stations will continue while the debate and blame game over the impact on electricity rates spins on, independent of facts but useful for politics.

David Wojick
Reply to  ResourceGuy
January 27, 2026 3:46 am

Behind-the-meter power does not work because getting a 100% capacity factor is too expensive.

Eng_Ian
Reply to  David Wojick
January 27, 2026 12:45 pm

It can work for off-grid domestic. In my case the expense of the connection to the grid was the defining point. It was cheaper to NOT buy all the power poles, cable, the transformer and the design fees than it was to install an off grid power supply.

BUT… this does not apply for anyone with the grid connection available just metres from their front door. In those cases, as you point out, it is too expensive to switch off completely. Besides, if it did get cheap enough to switch off, the government(s) would create a new billing structure so that everyone paid, even if they didn’t connect.

I give it 5 years before that becomes the law across the nations where the sun shines regularly throughout the year.

January 27, 2026 2:47 am

I don’t see the point of this ‘article’. A misunderstanding? So what?
Isn’t the main issue simply limited energy and where it goes, whether you call it a data or computer centre?

Tom Halla
Reply to  ballynally
January 27, 2026 3:12 am

The level of technological illiteracy is astounding. As in, most green voters.

Reply to  Tom Halla
January 27, 2026 12:08 pm

Well, your level of misunderstanding trumps my ‘technological illiteracy’ by some degrees. How you get what you said out of what I said is rather amusing.

I should not really respond, but there you are. Or.. maybe it is ME who is misunderstanding YOU and it was not meant as a reply to me.
The mind boggles..

Reply to  ballynally
January 27, 2026 12:33 pm

The level of technological illiteracy is truly astounding, green voters or not. When demand exceeds supply, of course the price increases. Price has a habit of allocating resources until distortions (politics or sleight-of-hand) get involved. An example of a distortion (sleight-of-hand) is: everyone thinks the internet is free, but it is not. Accessing the internet used to be $10/mo but is now over $100/mo. Few of us used the internet at $10/mo, but now most everybody uses the internet for at least $100/mo (TVs, PCs, phones, etc).

Reply to  ballynally
January 27, 2026 3:34 am

Yes, it isn’t very clear. But what is driving power consumption is neither Internet routing nor data storage; it is actual computer programs used to analyse data, so you get the next useless gadget you never knew you didn’t want, presented as a must-have advertisement in your online surfing…

The unexpected upside of this is that big CPU centres are going to need nuclear power stations to run them, and the extra heat and electricity will be significant grid providers and useful for places needing low-grade heat, like office blocks or homes or greenhouses growing marijuana and lettuces in, e.g., Alaska… 🙂

And there is so much money flooding into AI that politicians and green groups will be bought off, regulations will be slackened and those nuclear power stations will actually get built.

I suspect the ‘new industrial towns’ will be built where cooling water exists and no one has a stake already… places like the Arabian peninsula, with lots of sea water and bugger all else. An integrated data centre and desalination plant could create an oasis…

And even failing post-industrial towns, provided they are built on a river or lake, could be revitalised with small nuclear power stations to provide cheap energy for raw industries like aluminium smelting and so on, as well as data centres.

Conservatism is ultimately a policy of keeping what works, and only changing what is necessary to achieve a better life etc. etc.

So dumping woke, which has done no one any good, in favour of nuclear power would seem to be in the proper conservative tradition.

1saveenergy
Reply to  Leo Smith
January 28, 2026 8:55 am

Which Type of Conservatism ??

Liberal Conservatism: Combines conservative values with liberal economic policies.

Social Conservatism: Emphasises traditional social values and norms.

National Conservatism: Focuses on national identity and sovereignty, often sceptical of immigration.

Paternalistic Conservatism: Advocates for social responsibility and a safety net for the less fortunate.

Progressive Conservatism: Seeks to blend progressive reforms with conservative principles.

MarkW
Reply to  ballynally
January 27, 2026 6:56 am

The point is that too many people are starting to believe that AI and data center are synonymous. While AI’s require data centers, so do many other users.

Reply to  ballynally
January 27, 2026 8:25 am

“I don’t see the point of this ‘article’. A misunderstanding? So what?
Isn’t the main issue simply limited energy and where it goes, whether you call it a data or computer centre?”

The point he is making is that you can have furious activists demonstrating against a site containing lots of computing and storage, objecting on the grounds that all the data storage or AI or whatever it houses is destroying the planet by contributing to global warming because of its electricity use.

And not realizing that the internet they are using over their phones, their access to Amazon or Ebay, their streaming, their social media, it’s all running off such sites. Both the switching and connectivity that enable you to connect, and what you connect to. It’s not like you can say: here are switches and connectivity, that is OK, and here is storage and processing, and we want less of that. He is pointing out that Internet switching and routing is in fact data processing, and what the activists are demonstrating against is what enables them to, among other things, communicate with each other and organize their demonstrations.

This is why he goes into such specifics about packet switching.

Back in the day this was not the case. You had great banks of first mechanical, then electro-mechanical switches. It really was true then that there was a sharp distinction in technology and site contents between switching, connectivity, and storage. Those days are gone. However, the mental model lives on in the minds of the modern uninformed, who are self-righteously and without realizing it demanding the impossible: that we close down the internet while at the same time keeping it going.

To save the planet.

Reply to  michel
January 27, 2026 12:15 pm

I’m pretty sure those (green) activists already live in a world of contradictions. It is not really ideologically sound, so their narrative can contain clear contradictions.
However, some activists are actually concerned about who gets to decide who gets the energy. It’s legit to question the process involved.
If people don’t see the distinction they should open their eyes.
But people get triggered, narrow their minds and go full binary.

Nick Stokes
Reply to  michel
January 27, 2026 4:07 pm

“And not realizing that the internet they are using over their phones, their access to Amazon or Ebay, their streaming, their social media, it’s all running off such sites.”

We had all those things ten years ago, when “such sites” barely existed.

old cocky
Reply to  Nick Stokes
January 28, 2026 2:10 am

You’re joking, right?
Right?

Reply to  Nick Stokes
January 28, 2026 7:04 am

1) “such sites” certainly DID exist, just not as many, and maybe not as big (I worked with 2 of them)

2) the internet is vastly larger than it was 10 years ago, necessitating more datacenters.

2hotel9
Reply to  ballynally
January 27, 2026 9:18 am

Really? You actually read that and don’t get it? Tell me, do you often have two black eyes at same time? Just curious. 😉

January 27, 2026 3:17 am

Well there is a nuanced difference between a dark office running internet routers and a fully equipped data centre, in that internet nodes store no data beyond that actually in transit.

There is a difference between the connectivity aspect of the Internet – wires fibres and routers – and the data that are accessible via the Internet. Servers and clouds.

That is, the computers running web servers, and centralised cloud technology. The latter are in fact what ‘data centre’ usually means.

Of course there is nothing to stop both functions going into one building….

observa
January 27, 2026 4:05 am

Well if the fickles aren’t up to scratch we can always load shed the internet centres-
AEMO data shows demand at peak of yesterday’s heat exceeded supply for South Australia – ABC listen
10:30PM and 72% dinojuice across the NEM grid-
AEMO | NEM data dashboard

January 27, 2026 4:54 am

“You cannot logically love the internet and hate data centers…”

Unless it’s built next to your neighborhood. They’re very noisy and will certainly lower your property value.

David Wojick
Reply to  Joseph Zorzin
January 27, 2026 6:24 am

Yes I mention that. They are industrial facilities so need to be sited accordingly.

Reply to  Joseph Zorzin
January 27, 2026 12:23 pm

Yes, that is a clear omission. Not only that, the issue of scale isn’t mentioned either. Furthermore: the effort of trying to educate people usually doesn’t work. Entrenched brains stay stuck. Its position is irrelevant.

David Roberts
January 27, 2026 4:59 am

There is a proclivity, for many of those of us who think about the future, to see constraints and dangers. Humanity did not get to where we are today without this aspect of our nature. However, certainly, for the topic of this thread, we should not forget Moore’s Law and Julian Simon’s human ingenuity.

Reply to  David Roberts
January 27, 2026 12:29 pm

And you should also not forget that Moore’s Law worked because people believed it did.
And the tech fallacy in which new technology will sort things out. Both the Greens and anti-Greens fall for this.
I wouldn’t put my faith in the likes of Musk and Thiel to ‘sort it out’. These guys are potentially and in reality dangerous if given free rein. I’m sure they have already pretty much worked out what to do with humanity.
Mars anyone?

MarkW
January 27, 2026 6:52 am

The description above is close, but not quite accurate.
What actually happens is that when a packet is sent, it is transmitted to all computers that are connected to the sending computer. The header of each packet has four key fields: the address of the sending computer, the address of the target computer, a packet number and a hop count.
When a computer receives a packet, it looks at the target address. If the receiving computer is the target computer, it stores the packet and sends a response to the originating computer indicating reception. If the receiving computer is not the target computer, it decrements the hop count. If the hop count is not zero, it then sends the packet (with the new hop count) to all the computers it is connected to, and the cycle repeats. The receiving computer uses the packet number to reassemble the transmitted file.

Because of the nature of the internet, it is possible that the target computer will receive multiple copies of the same packet. It then selects the packet that took the fewest hops to arrive, and it sends the response to the computer that sent it that packet. Each computer along the chain does the same thing. As a result, the second and all subsequent packets are sent along the most efficient route, instead of being broadcast to the world.
If one computer in this chain stops communicating before all the packets are sent, the last computer in the chain restarts with the broadcast method until a new path is established.
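
The flood-with-a-hop-count scheme described in this comment can be simulated on a tiny made-up network. This is a toy sketch of the model exactly as described (the hop count plays the role of IP’s TTL field; it also skips the duplicate-packet details by remembering which nodes have already seen the packet):

```python
from collections import deque

# A small made-up network: node -> list of directly connected neighbors
NETWORK = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def flood(source, target, max_hops):
    """Broadcast a packet to all neighbors until it reaches the target
    or the hop count runs out. Returns the fewest hops used, or None."""
    queue = deque([(source, max_hops, 0)])  # (node, hops remaining, hops used)
    seen = {source}
    while queue:
        node, hops_left, hops_used = queue.popleft()
        if node == target:
            return hops_used
        if hops_left == 0:
            continue  # hop count exhausted: the packet is dropped here
        for neighbor in NETWORK[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops_left - 1, hops_used + 1))
    return None

print(flood("A", "E", max_hops=8))  # shortest route A-B-D-E (or A-C-D-E): 3 hops
print(flood("A", "E", max_hops=2))  # hop count too small: packet never arrives
```

Because the breadth-first flood explores nearer nodes first, the first copy to reach the target is automatically the fewest-hops copy, which is the route-selection behavior the comment describes.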

There are still plenty of traditional data centers. The company that I work for has databases that are hundreds of terabytes in size. We have programs that process updates to that data, which run in a daily cycle that takes 5 to 6 hours, most of the time. We also have weekly, monthly, quarterly and yearly processes, all of which have to be scheduled so that they always run on different days. These processes take 2 to 4 hours to run as well, and most of them can’t start until the daily run has finished. There is one report that I have worked on that takes about 4 hours to run all by itself.

Between the DB servers, the Linux servers, the Microsoft servers, the routers, the load balancers and the firewalls, we have close to 100 virtual machines.
Not only that, but we maintain a backup site. Contractually, we are required to go live with our backup site within 48 hours of the primary going down. We have yearly disaster recovery exercises in which we prove we can do that.

MarkW
Reply to  MarkW
January 27, 2026 7:30 am

I don’t remember how this works, but there are also tricks that can be played to make sure that a packet that is sent stays in the sender’s node or sub-node.

David Wojick
Reply to  MarkW
January 27, 2026 11:53 am

Thanks for this. My article target length is just 600 words so I have to greatly oversimplify.

Beta Blocker
Reply to  MarkW
January 27, 2026 11:56 am

See my comment here concerning what the advocates of distributed computing architectures are saying about how the power consumption of AI data centers can be greatly reduced.

Their basic position is that we don’t need AI-focused data centers to get the AI processing work done.

Their claim is that the processing can be done much closer to where the data actually resides thus reducing the number of I/O processing cycles needed to bring the data into the CPU while reducing the time the CPU spends idle waiting for data to chew on.

old cocky
Reply to  MarkW
January 28, 2026 2:51 am

“when a packet is sent, it is transmitted to all computers that are connected to the sending computer.”

Yeah, nah.
UDP is a broadcast protocol, TCP is directed. If the target IP address is on the same subnet, as defined by the sender’s address and netmask, it is sent to that address. There’s a bit more to it with discovering the MAC address mapping, but that’s the gist of it.
Computers haven’t been on the same physical network segment since coax was replaced by twisted pair late last century.

If the target is on a different subnet, the packet is sent to the lowest cost gateway for the target subnet. Mostly, the gateway is the default for “everything which isn’t on my subnet”, but it is quite common to define gateways for specific subnets.
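
The same-subnet decision described in this comment can be shown with Python’s standard `ipaddress` module. The addresses, netmask and gateway below are made-up examples:

```python
import ipaddress

def next_hop(src_ip, netmask, dst_ip, default_gateway):
    """Deliver directly if dst is on the sender's subnet (as defined by the
    sender's address and netmask); otherwise hand off to the gateway."""
    subnet = ipaddress.ip_network(f"{src_ip}/{netmask}", strict=False)
    if ipaddress.ip_address(dst_ip) in subnet:
        return dst_ip            # same subnet: deliver directly (via MAC/ARP)
    return default_gateway       # different subnet: send to the gateway

# Same /24 subnet: delivered directly to the target address
print(next_hop("192.168.1.10", "255.255.255.0", "192.168.1.77", "192.168.1.1"))
# Different subnet: handed to the default gateway
print(next_hop("192.168.1.10", "255.255.255.0", "8.8.8.8", "192.168.1.1"))
```

This is only the sender’s first decision; as the comment notes, discovering the MAC mapping and the routers’ own gateway selection involve further protocols.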

Routers have various protocols for network discovery, to find their lowest cost upstream peers, address mirroring, redundancy, etc. My team worked with very tightly controlled subnets state-wide, so the external routing was never our bailiwick.

Between the DB servers, the Linux servers, the MicroSoft servers, the routers, the load balancers and the firewalls we have close to 100 virtual machines.

It must be nice to work on a medium sized system.

“Not only that, but we maintain a backup site. Contractually, we are required to go live with our backup site within 48 hours of the primary going down.”

48 hours. Looxury.

There are lots of other essential services running in DCs as well, such as time servers, name servers, and mail servers.

Doug S
January 27, 2026 8:07 am

Excellent post! Very good points thank you.

Dave Andrews
January 27, 2026 8:14 am

The IEA estimates that around USD 580bn was invested in data centres worldwide in 2025 and that there will be a doubling of total electricity use by data centres by 2030. Even so, they expect total electricity use by such centres to account for less than 10% of global electricity demand growth to 2030.

The US, China and Europe account for 82% of global data centre capacity and this will grow to 85% over the next few years.

In China and Europe data centres are expected to account for 6-10% of growth in electricity demand by 2030 whilst in the US they will account for 50% of electricity demand growth.

55% of data centres in the pipeline are larger than 200MW in capacity and would each consume as much electricity as 200,000 people when fully online.

IEA ‘World Energy Outlook 2025’ (Nov. 2025)

David Wojick
Reply to  Dave Andrews
January 27, 2026 12:00 pm

Data center growth is limited to a fraction, possibly large, of dispatchable generation growth. IEA may not have considered that. It takes years to design, finance, permit, procure and build a power plant.

Beta Blocker
Reply to  David Wojick
January 27, 2026 12:55 pm

It’s not just data centers. If lots of manufacturing is re-shored back to America, power consuming industries will increase the overall demand for electricity.

See my comment from two weeks ago, The Chip Plant that Ate New York

The Micron plant being built in Clay, New York, will draw its electricity from the New York power grid. Because the New York grid is attached to other regional grids, the plant will be consuming electricity generated by coal-fired, gas-fired, and nuclear power plants, plus whatever power wind and solar can manage to generate.

Micron has requested grid connections as early as 2025–2026, with an initial draw of up to 480 megawatts (MW) of continuous power to support early operations. By 2043, when all four planned fabs are complete, the complex is expected to consume 16,000 to 16,170 gigawatt-hours (GWh) annually.

At this scale, the facility will require roughly 1.85 gigawatts (GW) of continuous power—surpassing the annual electricity usage of Vermont and New Hampshire combined. IMHO, neither Micron nor the State of New York has a clue where all that power will be coming from. Not a clue.
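
For anyone who wants to check the conversion, the 1.85 GW figure follows directly from the annual energy numbers quoted above:

```python
# Convert an annual energy figure (GWh per year) into continuous power (GW).
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

annual_gwh = 16_170                       # upper end of the quoted 16,000-16,170 GWh
continuous_gw = annual_gwh / HOURS_PER_YEAR
print(f"{continuous_gw:.2f} GW continuous")  # ~1.85 GW, matching the comment
```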

2hotel9
January 27, 2026 9:13 am

Excellent explanation! Sending this to a pile of people.

David Wojick
Reply to  2hotel9
January 27, 2026 12:01 pm

Thank you. That is the hope.

Bar Code
January 27, 2026 9:32 am

I’m astounded that governments have not assessed special taxes on data centers.

Reply to  Bar Code
January 27, 2026 10:10 am

Well, there you go . . . you just blew it!

/sarc

January 27, 2026 10:05 am

Good explanation of the basics of Internet routing of info packets!

However these consecutive sentences:
“These routers repeat this forwarding process until all the packets reach the final destination. Here the computer reassembles the email and the document so it can be viewed.”
gloss over what I consider to be a critical factor in the total Internet protocols that make the system so “bomb proof”: the server/router that performs the initial message breakdown into packets will keep sending out repeat/duplicate packets until it receives confirmation from the intended destination server/router that the entire message (i.e., all packets) has been successfully transmitted and accurately reconstructed based on verification (error-code) algorithms.
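
The retransmit-until-acknowledged behavior described above can be sketched as a toy stop-and-wait loop. Real TCP uses sliding windows, sequence numbers and per-segment checksums; the CRC-32 check and the simulated flaky link here are illustrative stand-ins:

```python
import zlib

def send_with_retries(packet, unreliable_send, max_retries=5):
    """Keep retransmitting until the receiver echoes back a packet whose
    checksum verifies (our stand-in for an acknowledgment), or give up.
    Returns the number of transmission attempts used."""
    checksum = zlib.crc32(packet)
    for attempt in range(max_retries):
        received = unreliable_send(packet)  # may drop (None) or corrupt the packet
        if received is not None and zlib.crc32(received) == checksum:
            return attempt + 1              # verified: transmission acknowledged
    raise TimeoutError("no valid acknowledgment received")

# Simulated lossy link: drops the first two transmissions, then succeeds.
drops = iter([None, None])
def flaky_link(pkt):
    return next(drops, pkt)

attempts = send_with_retries(b"packet 7 of 12", flaky_link)
print(f"delivered after {attempts} attempts")  # delivered after 3 attempts
```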

David Wojick
Reply to  ToldYouSo
January 27, 2026 12:03 pm

Cool! I did not know that. So there is that much more activity to be housed. It is truly amazing.

Reply to  David Wojick
January 28, 2026 3:59 pm

Yep . . . truly amazing!

IMHO one of the great failings of the Nobel Prize selection committees in Sweden and Norway is that they have not yet awarded a Nobel Prize in Physics or a Nobel Peace Prize to DARPA, the organization most behind developing the Internet (an advancement on the original ARPANET), as well as, perhaps, jointly to Vint Cerf and Bob Kahn, who are credited with designing the Internet’s foundational TCP/IP protocols in the 1970s.

One can understand that the Nobel Prize committee would be reluctant to award a prize to a military organization, but it is INEXCUSABLE to not give such recognition to specific individuals that contributed so much to a technology that is used throughout the world to the direct benefit of the majority of humans on the planet.

sidabma
January 27, 2026 11:05 am

The physical AI data centers are nothing really to look at. They are a warehouse full of computer components. They are apparently considered vulnerable, and so are subject to high security. Because of all the heat that gets generated by the servers, outdoor cooling systems run 24/7. Apparently they are quite noisy, and the neighbors don’t like living near these locations.
Because there have been no set rules on where they can be built, local governments give them tax breaks to move into their communities, and sometimes they end up near subdivisions. That does not always get accepted nicely.

This AI Data industry is not going to stop anytime soon. I think someone should be calling in these AI Group leaders and laying out some rules and regulations. America is a big country. AI Data parks similar to industrial parks, and supplying to them the services they require is one way to deal with it.

The other option is community-based locations, somewhat away from the established subdivisions, with their own Community Power Plant from which they receive their electrical needs. Water requirements need to be discussed with the community they are moving into, getting supplied water or installing their own type of cooling system.
If they go the community route, it should also be mandatory that their Community Power Plant operates as energy-efficiently as possible. The combusted natural gas exhaust from the power plant can be used to create hundreds more good-paying full-time jobs for the local community that is sponsoring them to build and operate there.

How can we all get along as Americans? We work at it.

David Wojick
Reply to  sidabma
January 27, 2026 12:12 pm

A lot of this is being worked on at the State and County level. But the community power plant does not work because no plant gets a 100% capacity factor. Nukes maybe 90% but gas a lot less I think.

Reply to  sidabma
January 27, 2026 3:57 pm

Here’s part of the (largely unrecognized) problem:

If an upcoming AI data center is going to be “consuming” all the power output of a single modern nuclear reactor—as has been fronted as a reason for restarting the Unit 1 reactor at Three Mile Island in support of a single Microsoft data center—then the removal of the “waste heat” from the data center will require about half the amount of cooling as that done for the single reactor in the power plant.

Reason that this is so: modern nuclear reactors using a steam turbine loop for electrical power generation have an overall thermal efficiency of only about 33%. So, unless the nuclear plant is located along the shoreline of an ocean or large river (or perhaps one of the Great Lakes in the US), the data center is going to need an evaporative cooling tower about half the size of the one put up at Three Mile Island (or, say, two each at one-quarter that size) . . . not very attractive in any community! See the attached photo.
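
The “about half” claim follows from the efficiency figure; a quick back-of-the-envelope check:

```python
# At ~33% thermal efficiency, a reactor delivering P_e of electricity rejects
# about 2*P_e as waste heat at the plant, while the data center ultimately
# turns essentially all of P_e into heat -- half the plant's cooling load.
efficiency = 1 / 3                        # typical light-water reactor efficiency
p_electric = 1.0                          # normalize: 1 unit of electric output
p_thermal = p_electric / efficiency       # reactor thermal power: ~3 units
plant_waste_heat = p_thermal - p_electric # ~2 units rejected at the plant
datacenter_heat = p_electric              # all delivered electricity ends up as heat

print(round(datacenter_heat / plant_waste_heat, 2))  # 0.5 -> half the cooling
```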

[Attached photo: TMI cooling towers]
Beta Blocker
January 27, 2026 11:46 am

Advocates of the distributed computing model claim that computing architectures can be created which locate the AI processing CPU operations physically closer to the data retrieval operations, thus greatly reducing the number of I/O cycles needed — and therefore the electric power consumed — to get the data from where it resides to where it needs to be in order to be processed.

As I read their arguments, if you are a heavy-duty consumer of data in a corporate environment, you should not place your data into a cloud located inside someone else’s data center.

You should create a processing architecture inside your own organization which is located inside your own facilities, an architecture in which distributed AI processing works on data which is much closer physically in time and space to the CPU processing activity.

The distributed AI advocates are also heavy-duty critics of computing architectures which rely on SQL-based queries to initially retrieve data from SQL Server, Oracle, Sybase, and Informix databases. These advocates for distributed computing architectures view SQL databases as relics of a bygone computing era.

Do their arguments hold water? Can the advocates of distributed computing architectures make good on their promise to reduce the amount of electricity being consumed by data centers a hundred times and more? (Basically by doing away with data centers as we know them for large-scale business computing.) IMHO, the jury is still out on that question.

January 27, 2026 11:57 am

An intelligent person like David Wojick will know both the strengths and weaknesses of AI and be able to use it as a tool but the majority of people do not have the insight to recognize the shortcomings and when they are being led astray. I believe AI will also lead to a further dumbing down in education where a small minority will benefit enormously but the overwhelming majority become more ignorant and incapable of doing even simpler tasks without any aid.

Young children need to learn various skills like learning to read and write and do sums. Take the skill of cursive writing which many schools abandoned and now recognize as valuable both for being able to read cursive writing but also to develop their fine finger coordination. Take the skill of being able to read a paragraph and to concisely and accurately summarize the material. Take the skill of being able to read an article or book to sift it for the most important details and how to find and evaluate primary sources. A child’s mind needs to be stimulated and exercised, especially the most lazy. A child needs to learn how to work things out from first principles and grow into an adult who can adapt to all sorts of circumstances.

Bigger data centers and more access to the internet may be too costly and not provide the benefits that AI proponents are promising especially for poorer countries. Manufacture comes from the Latin words hand and to make, i.e. make by hand. If largescale automated manufacture takes away the jobs of large numbers who have made things by hand will they be better off? I certainly do not see them learning to program and producing the software for AI. I can see AI having its niche but am skeptical about it being cost effective, practical and reliable for every human activity.

David Wojick
Reply to  Michael in Dublin
January 27, 2026 12:52 pm

The projected job losses are from fanciful robotics not AI, so a very different issue. I do not see the revolutionary “read and reason about it” AI systems contributing much to robotics. This is another huge confusion.

Reply to  David Wojick
January 28, 2026 6:25 am

It is related. Only robots powered by AI are able to really replace humans in manual work. Field tending, cooking, etc.

Bob
January 27, 2026 1:53 pm

This makes sense, David. Here is the problem in my view. Few people are afraid of big buildings with lots of electronic stuff in them. That is not the problem. Their energy consumption could be a problem, but not one that can’t be solved. The problem is how AI is used. It is already being abused in that we are hounded 24/7 with the claim that AI is the answer to our problems. It is not a question of whether or not AI has an answer; the problem is, can we trust the answer? I look up a lot of stuff on the internet; I don’t automatically trust any of it. Ninety percent of the time you get a canned answer of the generally accepted consensus. That is a bad thing. In this regard I see AI as the internet on steroids. Most things I look up, I have to go three or four pages in before I get anything other than the canned consensus. I don’t like that.

January 27, 2026 3:54 pm

Yes, power consumption and the waste heat are the major identified problems with AI data centers. But where one sees something as a problem, others see it as an opportunity. Research is being done by a number of companies to mitigate those problems.

One example is a company called Lightwave Logic. The odds of success are very low. I do not own shares in it, and do not recommend buying any – you have a much better chance with a roulette wheel betting on black. It’s just fascinating to see innovative technology. Lightwave Logic, Inc. is a development stage company which engages in the commercialization of electro-optic photonic devices. The firm offers the P2ICTM technology platform which uses in-house proprietary organic polymers. Its products include electro-optical modulation devices and proprietary polymer photonic integrated circuits. The polymer reduces the energy required to move data bits on chips by roughly 2/3. The tremendous reduction in power consumption naturally reduces waste heat and reduces the requirements for cooling.

I have seen a ‘gift from nature’ before (erbium-doped fiber amplifiers, e.g.), so while I don’t expect success in these ventures, I never completely discount the possibility.

Reply to  jtom
January 28, 2026 4:24 pm

“The polymer reduces the energy required to move data bits on chips by roughly 2/3.”

Hmmmm . . . let’s think about that for just one second . . . OK, the best modern computer chips currently have photo-etched circuits on silicon wafers with about 25-nanometer spacing (https://www.directive.com/blog/nanotechnology-holds-the-key-to-doubling-computing-power.html).

So, how, exactly, is one going to apply polymers accurately to tens-of-nanometer precision on IC microchips, as well as have them be stable at IC operating temperatures up to, perhaps, 200 deg-F in an AI circuit?

Might as well be talking about unicorns.

January 27, 2026 5:25 pm

It was on the IP Sharp APL global packet-switched email system (it even reached oil platforms) in 1980 that I contracted Coherent Systems to CoSy, because the system only took 4-byte names.

Arthur Jackson
January 30, 2026 12:10 am

“Data centers are primarily the home of the physical internet” No, the data centers are in fact data centers. They are not Internet infrastructure. The Internet can run perfectly well without any data centers at all, it’s a network.

The cloud(s) is made up of data centers. Everything you do over the Internet from navigation, taking pictures, your x-rays, sports, movies, making phone calls, your doorbell, emails, texts, buying hamburgers is run through telecommunications infrastructure (the Internet) and ultimately stored in the cloud (data centers) and the data is sold to businesses and governments. The cloud is nothing but a room full of electronic storage connected by communications switches and routers (the Internet) to other geographically separated rooms full of electronic storage. These rooms full of storage are the data centers.

Data centers also host applications and processing services for finance, bitcoins, games, engineering, online gambling, you name it. Some apps are supposedly AI but mostly are just friendly sounding code. The Internet facilitates communications between end users and the data centers.

These data centers create an enormous amount of global warming heat. Many companies’ data centers have their own power plants and are not depending on intermittent providers like wind and solar; they are going nuclear. There is no CO2, only tons of heat. How do you warm the earth? With data!!!