
Your Recycling Is a Lie — And the Fix Requires Physics, Not ChatGPT
I watched a perfectly good polypropylene tray — the kind you'd buy sushi in — sail off the end of a conveyor belt and drop into a bin marked "residue." Residue is the polite word. It means landfill. It means incineration. It means failure.
The tray was black. That was its only crime.
I was standing in a Materials Recovery Facility in Europe, the kind of place that processes tens of thousands of tons of waste a year, and I was watching their state-of-the-art optical sorter — a machine that costs more than most apartments — systematically ignore every dark-colored object that passed beneath its sensors. Not because the machine was broken. Because the physics of its sensor made black plastic literally invisible.
That moment changed the trajectory of my company. At Veriprajna, we build deep AI systems for industrial problems, and I'd come to this facility expecting to find a software problem. A classification gap. Something we could tune. Instead, I found a hole in the electromagnetic spectrum — and no amount of machine learning could fill it.
The Scale of What We're Throwing Away
Here's a number that should bother you: of the 353 million tonnes of plastic waste generated globally each year, only 9% gets recycled. Half goes to landfill. A fifth is incinerated. The rest is mismanaged — a euphemism for "dumped somewhere we'd rather not think about."
Black plastics make this picture worse. They constitute between 3% and 15% of the total plastic waste stream depending on where you are. In a facility processing 50,000 tons a year, that's thousands of tons of material — polypropylene, polyethylene, ABS, polystyrene — that gets ejected from the recycling stream not because it can't be recycled, but because the machines can't see it.
And this material isn't worthless. Recycled black polypropylene trades at $1,130–$1,200 per ton. Recycled ABS fetches $800–$1,100. A single mid-sized facility is throwing away over $2 million in recoverable value every year. That's not a rounding error. That's a business model waiting to be unlocked.
You can't recycle what you can't see. And right now, the entire industry is blind to as much as 15% of the waste stream.
Why Is a Black Tray Invisible to a Recycling Robot?
The answer lives in a pigment called carbon black. It's produced from the incomplete combustion of petroleum, and it's the reason most black plastics are black. It's also one of the most effective light absorbers ever manufactured.
Standard recycling sorters use Near-Infrared spectroscopy — NIR — operating between 0.9 and 1.7 micrometers. The way it works is elegant: halogen lamps flood the conveyor belt with light. When that light hits a colored or clear plastic bottle, it bounces back with specific wavelengths absorbed — a spectral fingerprint that tells the sensor "this is PET" or "this is HDPE." The pneumatic ejector fires. The bottle lands in the right bin.
But when that same light hits carbon black, it doesn't bounce back. The pigment absorbs the photons across the entire NIR range and converts them to heat. The sensor receives nothing. And because the conveyor belt itself is typically black rubber, the machine sees a black object on a black background returning zero signal. To the sorting algorithm, the belt looks empty.
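To make that concrete, here is a minimal sketch in Python of what the fingerprint-matching step amounts to. Everything in it is illustrative: the wavelength grid, the synthetic reference curves, and the thresholds are my stand-ins, not a real sorter's calibrated library. The part that matters is the first check inside classify: a carbon-black flatline has no features to match, so it falls straight through to residue.

```python
import numpy as np

np.random.seed(0)

# Illustrative only: synthetic NIR reflectance fingerprints on a coarse
# 0.9-1.7 um grid. Real sorters use calibrated spectral libraries.
wavelengths = np.linspace(0.9, 1.7, 16)

def dip(center_um, depth):
    """Reflectance curve with a single absorption dip at center_um."""
    return 1.0 - depth * np.exp(-((wavelengths - center_um) / 0.05) ** 2)

reference = {
    "PET":  dip(1.66, 0.60),   # stand-in for PET's NIR absorption feature
    "HDPE": dip(1.21, 0.50),   # stand-in for the polyolefin C-H overtone region
}

def classify(spectrum):
    # Carbon black returns almost no light: a flatline has no contrast,
    # so there is nothing to match and the object goes to "residue".
    if spectrum.std() < 0.02:
        return "residue"
    scores = {}
    for name, ref in reference.items():
        a, b = spectrum - spectrum.mean(), ref - ref.mean()
        scores[name] = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    best = max(scores, key=scores.get)
    return best if scores[best] > 0.9 else "residue"

clear_pet = dip(1.66, 0.55) + np.random.normal(0, 0.01, wavelengths.size)
black_pp  = np.random.normal(0.02, 0.005, wavelengths.size)  # near-zero return

print(classify(clear_pet))  # -> PET
print(classify(black_pp))   # -> residue: nothing to match
```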
I remember explaining this to an investor early on. He said, "Can't you just train a better model on the dark pixels?" I pulled up a spectral readout from a black PP tray under NIR. It was a flat line. Noise. I told him: there is no data here. You cannot train a model on nothing.
He paused, then said, "What about GPT?"
I get this question more than I'd like to admit.
Why Can't You Just Use an LLM for This?
I want to be direct about something, because the current AI hype cycle has created a dangerous illusion: you cannot prompt your way out of a physics problem.
Large Language Models are probabilistic text engines. They predict the next token based on patterns in their training data. They're extraordinary at what they do. But they require input. In the case of black plastic sorting, the input from a standard NIR sensor is a null set — a flatline of noise indistinguishable from the conveyor belt background.
If you forced a generative model to classify that noise, it might guess. It might say "probably polypropylene" because PP is common. But guessing isn't sensing. In an industrial recycling line where contamination above 1–2% renders an entire bale unsellable, a confident guess is worse than no answer at all. It's a hallucination with physical consequences.
There's also the latency problem. Industrial sorting decisions happen in milliseconds — a conveyor running at 3 meters per second doesn't wait for an API call to a cloud server. By the time a cloud-based model returns its confident wrong answer, the tray is already in the residue bin.
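The arithmetic is brutal. A quick sketch (the 100-millisecond round trip is my assumption, typical for a cloud API call; the 5-millisecond edge figure is the one I quote later in this essay):

```python
BELT_SPEED_M_S   = 3.0     # conveyor speed cited in this essay
CLOUD_ROUND_TRIP = 0.100   # seconds; assumed, typical for a cloud API call
EDGE_INFERENCE   = 0.005   # seconds; the on-machine figure quoted below

def drift_cm(latency_s: float) -> float:
    """How far the object travels down the belt while the model 'thinks'."""
    return BELT_SPEED_M_S * latency_s * 100

print(f"cloud: object drifts {drift_cm(CLOUD_ROUND_TRIP):.0f} cm")  # 30 cm
print(f"edge:  object drifts {drift_cm(EDGE_INFERENCE):.1f} cm")    # 1.5 cm
```

Thirty centimeters is the difference between the air jet hitting the tray and hitting whatever happens to be behind it.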
An LLM wrapper cannot hallucinate photons that were never captured by the sensor. If the data doesn't exist, the model is blind — no matter how many parameters it has.
This is the distinction I keep coming back to between what I call "AI wrappers" and deep tech. A wrapper takes someone else's model and puts a user interface on it. Deep tech changes the physics of the measurement. We needed to change the measurement.
What Happens When You Shift the Wavelength?
Carbon black's absorption isn't infinite. It has limits. And those limits become exploitable when you move from the near-infrared into the Mid-Wave Infrared — the MWIR band, specifically between 2.7 and 5.3 micrometers.
This is where polymer chemistry gets loud.
In the NIR range, you're picking up "overtone" vibrations — faint echoes of molecular bonds. They're subtle, easily drowned out by carbon black. But in the MWIR, you hit the fundamental vibrations: the C-H stretching modes, the C=O carbonyl stretches, the aromatic ring modes. These signals are orders of magnitude stronger. Strong enough to punch through the carbon black pigment and reach the sensor.
The first time my team saw a clean spectral readout from a black polypropylene tray under MWIR, there was a moment of genuine disbelief. We'd been staring at flatlines for weeks. And then suddenly — peaks. Sharp, distinct, unmistakable. The 3.4 micrometer C-H absorption band was right there, as clear as any textbook diagram. Except this wasn't a textbook sample. It was a crushed, dirty food tray pulled from a real waste stream.
I turned to my engineer and said, "The tray was always talking. We were just listening on the wrong frequency."
That's the core insight. We didn't make the plastic more visible. We changed where we looked.
How Does MWIR Hyperspectral Imaging Actually Work?
We built our system around the Specim FX50, which is currently the only commercially viable hyperspectral camera covering the full 2.7–5.3 micrometer range needed for this application. And "commercially viable" is doing a lot of heavy lifting in that sentence, because this is not a webcam you bolt onto a conveyor.
The detector material is Indium Antimonide — an exotic semiconductor sensitive to thermal radiation. Because you're essentially detecting heat signatures at these wavelengths, the sensor has to be cooled to cryogenic temperatures — about 77 Kelvin, or roughly minus 196 degrees Celsius — using an integrated Stirling cooler. If you don't cool it, the sensor blinds itself with its own thermal noise.
The camera captures 154 spectral bands for every pixel along its scan line, and the motion of the belt sweeps that line into a three-dimensional data cube: two spatial dimensions plus wavelength. At 380 frames per second, it keeps pace with conveyor belts running above 2 meters per second.
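To put numbers on the data volume (a rough sketch; the 640-pixel line width and 16-bit samples are my assumptions about a typical configuration, not vendor-confirmed figures):

```python
SPATIAL_PIXELS   = 640   # assumed line-scan width; not a vendor-confirmed figure
SPECTRAL_BANDS   = 154   # per the camera described above
FRAME_RATE_HZ    = 380
BYTES_PER_SAMPLE = 2     # assumed 16-bit readout

rate = SPATIAL_PIXELS * SPECTRAL_BANDS * FRAME_RATE_HZ * BYTES_PER_SAMPLE
print(f"{rate / 1e6:.0f} MB/s of raw spectra")  # ~75 MB/s
```

Roughly 75 megabytes of raw spectra per second, every second the belt runs. Keep that number in mind for the edge-hardware discussion below.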
I wrote about the full sensor architecture and the physics behind it in our interactive whitepaper — the engineering details of cryogenic cooling alone could fill their own essay. But the key point is this: what the camera sees isn't color. It sees chemistry. A black PP tray and a black PS lid look identical to your eyes. Under MWIR, they have completely different spectral signatures — different peaks, different absorption patterns, different molecular identities.
We stopped doing computer vision and started doing chemical vision. The camera doesn't see "black shapes." It sees a stream of molecular fingerprints.
The AI That Reads Chemistry, Not Pictures
Capturing 154-band hyperspectral data at industrial speed generates an enormous volume of information. The question becomes: how do you classify it fast enough to trigger an air jet before the object falls off the belt?
The standard instinct in AI is to reach for a 2D Convolutional Neural Network — the kind that powers image recognition. ResNet, YOLO, the architectures that can tell a cat from a dog. But waste sorting breaks every assumption those networks rely on. A crushed bottle doesn't look like a bottle. A torn tray fragment has no recognizable shape. A shard of black automotive plastic is spatially identical to a shard of black food packaging.
Shape is unreliable. Chemistry isn't.
So we treat the problem as signal processing, not image recognition. For every pixel on the conveyor belt, we extract a one-dimensional vector of 154 values — the spectrum at that point. We feed that vector into a 1D-Convolutional Neural Network.
Instead of square kernels sliding over an image looking for edges and textures, our linear kernels slide over the spectrum looking for molecular signatures: a sharp drop at 3.4 micrometers, a broad shoulder at 4.0, a specific doublet peak that says "this is polystyrene, not polyethylene." The network learns the grammar of chemical bonds.
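For the architecturally curious, here is a minimal sketch of that idea, assuming PyTorch. The layer widths, kernel sizes, and five-class output are illustrative, not our production network; what matters is the shape of the problem: a single 154-value spectrum in, a polymer class out.

```python
import torch
import torch.nn as nn

class SpectralNet1D(nn.Module):
    """Illustrative 1D-CNN over a 154-band spectrum (not the production model)."""
    def __init__(self, n_bands: int = 154, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            # Linear kernels slide along the wavelength axis, learning
            # local spectral shapes: dips, shoulders, doublet peaks.
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),                        # 154 -> 77
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),                        # 77 -> 38
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                # pool over the wavelength axis
        )
        self.classifier = nn.Linear(64, n_classes)  # e.g. PP, PE, PS, ABS, residue

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_bands) -- one spectrum per conveyor-belt pixel
        return self.classifier(self.features(x).squeeze(-1))

model = SpectralNet1D()
pixel_spectrum = torch.randn(1, 1, 154)  # stand-in for a real measured spectrum
logits = model(pixel_spectrum)           # (1, 5) class scores
```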
There was a week where one of my engineers argued we should try a Transformer architecture instead — attention mechanisms, the same approach powering GPT. On paper it made sense. In practice, the quadratic cost of self-attention made inference too slow for a belt moving at 3 meters per second. Our 1D-CNN runs in under 5 milliseconds on edge hardware. The Transformer was still "attending" to the global context of the spectrum while our system had already classified the pixel and fired the ejector.
We don't run in the cloud. There's an NVIDIA Jetson AGX Orin sitting on the sorting machine. The data never leaves the facility. By the time a cloud-based system would have finished its round-trip, our air jet has already redirected the tray into the correct bin.
Fusing Two Ways of Seeing
MWIR tells you what something is. But it has lower spatial resolution than a standard camera and it's expensive. So we fuse it with RGB.
A high-resolution color camera handles segmentation — finding the boundaries of objects on the belt. It creates a mask: "there's an item at these coordinates." The MWIR camera captures the spectral data. Our fusion engine overlays the RGB mask onto the MWIR data cube and queries the spectrum inside each object boundary. The 1D-CNN classifies the material.
The output to the sorting robot is a composite data packet: Object #452 is black polypropylene, located at these coordinates, oriented at this angle. Pick it up. Put it in bin three.
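Sketched in code, the fusion step reduces to something like this (illustrative; the real pipeline also handles calibration, geometric registration between the two cameras, and overlapping objects):

```python
import numpy as np

def fuse_and_classify(rgb_mask, mwir_cube, classify_spectrum):
    """Illustrative fusion step, not the production pipeline.

    rgb_mask:  (H, W) integer array; 0 = belt, k = object id k
               (output of RGB segmentation, registered to MWIR coordinates)
    mwir_cube: (H, W, 154) array of per-pixel spectra
    classify_spectrum: spectrum -> material label (e.g. the 1D-CNN above)
    """
    packets = []
    for obj_id in np.unique(rgb_mask):
        if obj_id == 0:
            continue  # skip the belt background
        ys, xs = np.nonzero(rgb_mask == obj_id)
        # Average the spectra inside the object boundary to suppress noise.
        spectrum = mwir_cube[ys, xs].mean(axis=0)
        packets.append({
            "object_id": int(obj_id),
            "material": classify_spectrum(spectrum),
            "centroid": (float(ys.mean()), float(xs.mean())),
        })
    return packets  # one packet per object: identity plus location for the ejector
```

In this sketch, averaging the spectra inside each mask turns thousands of noisy per-pixel reads into one clean per-object decision.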
This hybrid approach lets us use cheap, fast RGB for the spatial work and reserve the expensive, information-dense MWIR for the decision that matters: what is this thing made of?
Why Doesn't the Industry Already Do This?
People ask me this constantly. If MWIR works, why isn't every recycling plant using it?
Three reasons.
First, the hardware barrier. Cryogenically cooled infrared cameras with exotic semiconductor detectors are not commodity items. You can't order one from a consumer electronics catalog. The Specim FX50 exists, but integrating it into a sorting line that handles real-world waste — dirty, wet, overlapping objects at speed — requires significant engineering.
Second, the AI barrier. Standard sorting machine firmware is designed for NIR data. You can't just swap the sensor and expect the existing software to work. The 1D-CNN architecture, the spectral preprocessing, the sensor fusion pipeline — all of this is custom. This is where Veriprajna lives. We provide the intelligence layer for hardware that was built for a different era of sensing.
Third, inertia. For years, the industry's answer to black plastic was "don't use it" or "accept the loss." Brands were told to switch to detectable pigments. Some did. Most didn't, because carbon black is cheap, UV-stable, and lets manufacturers use mixed-color recycled feedstock — the very thing that makes recycling economically viable in the first place.
Carbon black lets manufacturers use recycled content. But it also makes the final product invisible to recycling sensors. The pigment that enables circularity simultaneously destroys it.
The EU's Packaging and Packaging Waste Regulation is forcing the issue. By 2030, all packaging must be recyclable — not theoretically, but provably, in actual industrial facilities. If the sorter can't see it, it's legally unrecyclable. That regulatory deadline is concentrating minds.
The Economics That Make This Inevitable
I've learned that when you're selling deep tech to industrial operators, the environmental argument opens the door but the spreadsheet closes the deal.
Consider a mid-sized European MRF processing 50,000 tons per year. Black plastic content: 5%, or 2,500 tons. Currently, that material goes to incineration at a gate fee plus carbon tax of roughly €100 per ton — a cost of €250,000 annually just to destroy valuable material.
With MWIR sorting recovering 90% of that stream and selling the sorted pellets at €900 per ton, the math shifts dramatically: €2.25 million in combined revenue and avoided disposal costs. Against a system capital expenditure of around $300,000, the payback period is less than two months.
I've watched facility managers do this calculation on the back of an envelope and then immediately ask when we can install. The economics aren't marginal. They're overwhelming.
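Here is that envelope, reconstructed from the figures above (illustrative; I treat dollars and euros at rough parity for the payback line, as a back-of-envelope calculation does):

```python
tons_per_year    = 50_000
black_share      = 0.05      # 5% black plastic in the stream
recovery_rate    = 0.90      # fraction recovered by MWIR sorting
pellet_eur_ton   = 900       # sale price of sorted output
disposal_eur_ton = 100       # gate fee plus carbon tax avoided
capex_usd        = 300_000   # system cost, as quoted above

recovered = tons_per_year * black_share * recovery_rate   # 2,250 tons
revenue   = recovered * pellet_eur_ton                    # EUR 2,025,000
avoided   = recovered * disposal_eur_ton                  # EUR   225,000
annual    = revenue + avoided                             # EUR 2,250,000

# Treating USD and EUR at rough parity, as the envelope does:
payback_months = capex_usd / annual * 12                  # ~1.6 months
print(f"annual benefit: EUR {annual:,.0f}; payback: {payback_months:.1f} months")
```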
For the full technical breakdown — including the spectral differentiation data, the 1D-CNN architecture details, and the sensor fusion pipeline — I've published a detailed research paper that goes deeper than I can in an essay.
What This Is Really About
I started Veriprajna because I believed the hardest industrial problems couldn't be solved by wrapping an API. They require understanding the physics of the measurement, building the right sensor pipeline, and designing AI architectures that match the structure of the data — not the structure of the hype cycle.
Black plastic recycling is a case study in why deep tech matters. The problem was never that we lacked intelligence. The problem was that we lacked signal. We were shining the wrong light and then blaming the AI for not seeing anything.
When someone tells you AI can solve everything, ask them: solve it with what data? If the sensor can't capture the reality, the model is just a very expensive random number generator.
There are millions of tons of perfectly recyclable polymer sitting in landfills right now because of a pigment that absorbs near-infrared light. Not because the chemistry is wrong. Not because the economics don't work. Because the sensor was built for a world where everything is a convenient shade of blue or green.
The world isn't that convenient. And the solution isn't a better prompt. It's a better photon.


