
China’s Fusion Reactor Does the Impossible

China’s Experimental Advanced Superconducting Tokamak (EAST) has shattered a long-standing fusion barrier by achieving plasma densities far beyond traditional limits, entering a “density-free regime” once thought impossible.

What Happened

  • Reactor involved: EAST, often called China’s “artificial sun.”
  • Breakthrough: Plasma density was pushed well beyond the empirical “Greenwald limit.”
  • Key achievement: Plasma remained stable at extreme densities.
  • Publication: Results were published in Science Advances on January 1, 2026.

Why It Matters

  • Fusion ignition closer: Higher plasma density means more frequent fusion reactions.
  • Efficiency boost: Surpassing density limits could allow future reactors to generate more power.
  • Global impact: Removes one of the most persistent obstacles in fusion research.

How They Did It

  • Novel operating scheme: EAST used a high-density operating approach.
  • Density-free regime: This state had been theorized but never experimentally accessed until now.
  • Collaborators: Led by Prof. Ping Zhu and Associate Prof. Ning Yan.

Comparison: Traditional vs. Breakthrough Plasma Density

Aspect | Traditional Tokamaks | EAST Breakthrough
Plasma density limit | Greenwald limit (instability beyond it) | Surpassed without collapse
Stability | Instabilities trigger shutdown | Stable at extreme densities
Energy potential | Limited by density cap | Higher fusion reaction rates
Research status | Theoretical predictions only | Experimentally confirmed
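
For reference, the Greenwald limit cited above has a standard empirical form (textbook tokamak physics, stated here for context rather than taken from the article):

$$ n_G = \frac{I_p}{\pi a^2} $$

where $n_G$ is the line-averaged electron density in units of $10^{20}\,\mathrm{m}^{-3}$, $I_p$ is the plasma current in megaamperes, and $a$ is the plasma minor radius in metres. "Surpassing the Greenwald limit" means sustaining a stable plasma at a density $n > n_G$.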

Challenges Ahead

  • Scaling up: Replicating the result in larger reactors such as ITER will require extensive validation.
  • Engineering hurdles: Maintaining stability at high density over long durations is unresolved.
  • Commercialization timeline: Fusion power plants remain years—possibly decades—away.

A Single Pill to Stop Many Viruses? Researchers Say It’s Possible

Scientists have identified a potential pathway to a universal antiviral drug by targeting common structures on viruses, offering hope for broad-spectrum protection against future pandemics.

Adam Braunschweig, Professor of Chemistry at Hunter College, CUNY (New York), and his team at the Nanoscience Initiative of the Advanced Science Research Center (CUNY Graduate Center) have discovered compounds that block infection by multiple viruses.

What the Breakthrough Is About

  • Targeting sugars on viral surfaces: Researchers discovered that many viruses share similar carbohydrate structures on their outer shells. By designing small molecules that bind to these sugars, they were able to block infections across multiple virus families.
  • RNA-protein interactions: Another team uncovered how enteroviruses replicate using a structured RNA element. This insight could lead to drugs that disrupt viral replication at a fundamental level.
  • Broad-spectrum potential: Unlike current antivirals that are virus-specific (e.g., HIV or influenza drugs), this approach aims to work against many different viruses at once, including those we haven’t encountered yet.

Why This Matters

  • Pandemic preparedness: Right now, when a new virus emerges, scientists scramble to develop vaccines or treatments. A universal antiviral could serve as an immediate first line of defense.
  • Comparison to antibiotics: Just as broad-spectrum antibiotics revolutionized bacterial infection treatment, a universal antiviral could transform how we fight viral diseases.
  • Versatility: The compounds tested so far blocked infections from at least seven different viruses, showing promise for wide applicability.

Challenges Ahead

  • Safety & toxicity: Any drug that broadly targets viral structures must be proven safe for human cells.
  • Resistance risk: Viruses evolve quickly, so researchers must ensure these antivirals don’t lose effectiveness over time.
  • Clinical trials: The breakthrough is still in the lab stage. It will take years of testing before such drugs could be approved for human use.

Quick Comparison

Feature | Current Antivirals (e.g., HIV, flu) | Universal Antiviral (in research)
Target | Specific viral proteins | Shared sugars / RNA structures
Scope | One virus family | Multiple virus families
Availability | Approved & in use | Still experimental
Pandemic readiness | Slow response (new drug/vaccine needed) | Immediate broad-spectrum defense

This is a huge conceptual leap: instead of chasing each virus individually, scientists are trying to hit the common weak spots that all viruses share. If successful, it could be one of the most important medical advances of the century.

Figure 03: The Next Leap in Humanoid Robotics


Figure 03 is the latest humanoid robot developed by Figure AI, a California-based robotics startup founded in 2022 by Brett Adcock. Designed for real-world environments—from homes to warehouses—Figure 03 is a full-sized, bipedal robot capable of performing everyday tasks like folding laundry, washing dishes, and navigating complex spaces.

About Figure AI

Figure AI is backed by major tech investors including OpenAI, Microsoft, and Jeff Bezos. Its mission is to create scalable, general-purpose humanoid robots that can fill labor gaps and assist in daily life.
  • Founded in 2022 by Brett Adcock
  • Headquartered in California
  • Focused on real-world deployment of humanoid robots
  • Backed by OpenAI, Microsoft, and Jeff Bezos

Architecture Highlights

At the heart of Figure 03 is Helix, a proprietary Vision-Language-Action (VLA) model that enables real-time decision-making without pre-scripted commands.
  • Vision: Wide field of view and high frame rate
  • Language: Natural language comprehension and contextual reasoning
  • Action: Dexterous hands and stable locomotion for task execution
  • Unified System: Integrates perception, planning, and control
Helix makes Figure 03 one of the first humanoids capable of unscripted, adaptive behavior in dynamic environments.
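
As a rough mental model of how a vision-language-action loop is typically wired together, here is a minimal runnable sketch. Everything in it is hypothetical stand-in code for intuition only; Helix itself is proprietary and its internals are not public.

```python
# Hypothetical sketch of a vision-language-action (VLA) control loop.
# The stub functions stand in for large neural networks; only the data
# flow (perceive -> reason -> act, repeated continuously) is the point.

def vision_encoder(frames):
    # Real systems: encode wide-FOV, high-frame-rate camera input
    return {"objects": ["mug", "table"], "n_frames": len(frames)}

def language_model(instruction, scene):
    # Real systems: fuse the instruction with the scene into a plan
    return [f"locate {scene['objects'][0]}", f"carry out: {instruction}"]

def action_decoder(plan):
    # Real systems: decode the plan into joint and hand trajectories
    return [("move_arm", step) for step in plan]

def control_step(frames, instruction):
    scene = vision_encoder(frames)             # Vision
    plan = language_model(instruction, scene)  # Language
    return action_decoder(plan)                # Action

print(control_step(frames=[0, 1, 2], instruction="pick up the mug"))
```

Because the loop re-runs on every new camera frame, behavior adapts to a changing scene instead of replaying a pre-scripted trajectory, which is the unscripted, adaptive property described above.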

What Makes Figure 03 Revolutionary?

Figure 03 isn’t just a robot with limbs—it’s a general-purpose AI agent powered by Helix, a proprietary vision-language-action (VLA) model that allows it to understand, reason, and act in real time. It doesn’t follow scripts. It learns, adapts, and responds to the world like a human would.
  • Vision Upgrade: 60% wider field of view and 2× faster frame rate
  • Latency: Reduced by 75% for near-instant reactions
  • Mobility: Walks, balances, and navigates complex environments
  • Hands: Redesigned for dexterity—capable of folding laundry, pouring drinks, and more
Watch the full reveal in Introducing Figure 03, where the team showcases the robot’s capabilities and design philosophy.

Built for the Real World

Unlike its industrial predecessors, Figure 03 is engineered for human-centric environments. It’s 5'6" tall, weighs 60 kg, and is designed to operate safely around people.
  • Home Integration: Wireless charging, soft-touch materials, and voice interaction
  • Safety First: Advanced sensors and AI-driven motion planning
  • Household Tasks: From dishwashing to plant watering, it’s your new domestic ally
Curious how it performs in a real home? Is the Figure 03 Robot Ready to Clean Your House? puts it to the test in a domestic setting.

Beyond the Home: A Workforce Revolution

Figure AI envisions a future where humanoid robots fill labor gaps in logistics, manufacturing, and elder care. With a modular design and scalable production, Figure 03 is built for mass deployment.
  • Production-Ready: Designed for large-scale manufacturing
  • Cost Target: Estimated between $20,000–$30,000 per unit
  • Job Impact: Analysts predict up to 8 million jobs could be augmented or replaced by 2040

Explore the broader implications in Figure 03 – The Humanoid Robot Built for the Home and the ...

The Bigger Picture

Figure 03 is more than a machine—it’s a symbol of the AI-human future. With backing from OpenAI, Microsoft, and Jeff Bezos, Figure AI is betting big on a world where robots are not just tools, but teammates.

As the humanoid revolution accelerates, one thing is clear: the future isn’t just coming—it’s walking right through your front door.

HSBC’s Quantum Breakthrough Could Reshape Wall Street

In a landmark moment for financial technology, HSBC has unveiled results from a quantum computing trial that could redefine how Wall Street approaches bond trading. The bank’s experiment, conducted in partnership with IBM, demonstrated a 34% improvement in predicting bond trade execution—an edge that could translate into billions in competitive advantage.

Quantum Meets Wall Street

Using IBM’s Heron quantum processor, HSBC ran simulations on anonymized, production-scale European corporate bond data. Unlike previous quantum trials that relied on synthetic datasets or theoretical models, HSBC’s test was grounded in real-world trading conditions. The result: quantum algorithms outperformed classical methods in forecasting whether a bond would trade at its quoted price.



“This is our Sputnik moment,” said Philip Intallura, HSBC’s global head of quantum technologies. “It’s the first time quantum computing has shown tangible value in live financial markets.”

Why It Matters

Bond trading, especially in less liquid markets, hinges on predicting execution probability. A 34% boost in accuracy means traders can quote more confidently, manage risk better, and potentially unlock new revenue streams. For Wall Street firms competing on milliseconds and margins, quantum’s predictive power could be transformative.

The Quantum Arms Race


HSBC’s breakthrough adds fuel to a growing quantum race among global banks. JPMorgan Chase, Goldman Sachs, and Citigroup have all invested in quantum research, but HSBC’s use of real trading data sets a new benchmark. The trial also signals a shift from theoretical promise to practical deployment.

According to McKinsey, quantum computing could generate $72 billion in annual revenue by 2035, up from $4 billion last year. Financial services are expected to be among the earliest beneficiaries, especially in areas like portfolio optimization, risk modeling, and fraud detection.

What’s Next

While quantum computers remain in their infancy, HSBC’s trial proves that even today’s noisy intermediate-scale quantum (NISQ) devices can deliver meaningful results. As hardware improves and algorithms mature, quantum could become a core pillar of financial infrastructure.

For now, HSBC’s experiment is a wake-up call: the quantum future isn’t decades away—it’s already reshaping the foundations of Wall Street.

AI Designs Viruses That Kill Bacteria—A New Frontier in Synthetic Biology

In a stunning leap for synthetic biology, scientists have used artificial intelligence to design viruses that can infect and kill bacteria—ushering in a new era of programmable life forms and potentially revolutionizing medicine.

Researchers at Stanford University and the Arc Institute trained an AI model named Evo on over 2 million bacteriophage genomes. The goal? To teach the system how nature builds viruses that target bacteria. Evo didn’t just remix existing genetic material—it generated 302 entirely new viral genomes, many of which had never existed in nature.

Of those, 16 assembled into fully functional viruses that successfully infected and destroyed E. coli bacteria in lab tests. This marks the first time AI has been used to design complete, working viruses from scratch.
“We’re not just accelerating evolution—we’re directing it,” said one of the lead researchers. “This opens the door to custom-built phages that could target antibiotic-resistant bacteria with surgical precision.”

Why This Matters

  • Antibiotic resistance is one of the biggest threats to global health, with superbugs killing over a million people annually.
  • Phage therapy, which uses viruses to kill bacteria, has long been seen as a promising alternative—but finding the right phage is slow and unpredictable.
  • AI could dramatically speed up the discovery and design of targeted phages, potentially enabling personalized treatments for infections.

The Ethical Frontier

While the study focused solely on bacteriophages and excluded viruses that infect humans, the implications are profound. Experts warn that AI-designed viruses could behave unpredictably in complex ecosystems. There are also concerns about biosecurity and the potential misuse of such technology.
“We need robust oversight and ethical frameworks,” said a bioethicist not involved in the study. “This is powerful tech, and with great power comes great responsibility.”

What’s Next?

  • The team plans to expand Evo’s capabilities to design phages for other bacterial strains, including those responsible for hospital-acquired infections.
  • There’s growing interest in using AI to design viruses for agriculture, microbiome engineering, and environmental cleanup.
This breakthrough isn’t just about killing bacteria—it’s about reimagining what life can be. With AI as a co-creator, biology may no longer be bound by the slow march of evolution. It’s entering the age of intelligent design.

World's 1st FDA Approved Bioelectronic Implant for Arthritis

In a groundbreaking move, the U.S. FDA has approved the SetPoint System, the first neuroimmune modulation implant designed to treat moderate to severe rheumatoid arthritis (RA)—especially for patients who haven’t responded well to traditional medications.

What It Is

  • A vitamin-sized neurostimulator implanted in the neck via a minimally invasive outpatient procedure.
  • Delivers 1-minute daily electrical pulses to the vagus nerve, which helps regulate inflammation and immune response.
  • Patients recharge it wirelessly once a week using a collar-like neckband.

Clinical Impact

Based on the RESET-RA trial (242 patients), the device showed:
  • Significant improvement in RA symptoms within 3 months.
  • Sustained benefits at 12 months.
  • 75% of participants were able to stop other RA medications entirely.
  • Reported serious adverse events were low (~1.7%).

Why It Matters

  • RA affects over 1.5 million Americans, causing chronic pain, joint damage, and disability.
  • Traditional treatments like DMARDs and biologics can be costly and lose effectiveness over time.
  • This implant offers a non-pharmaceutical, long-term solution—potentially lasting up to 10 years.

Bioelectronic Medicine Milestone

This marks a major leap in bioelectronic medicine, using the body’s own neural circuits to combat disease. SetPoint Medical plans a targeted U.S. rollout in late 2025, with broader availability expected in 2026.

The results of the RESET-RA study, which led to the FDA approval, are published in the journal Bioelectronic Medicine.

A Robot That Gives Birth? China’s Kaiwa Sparks Global Debate

Kaiwa Technology, a Guangzhou, China-based firm led by Dr. Zhang Qifeng, has unveiled plans to launch the world’s first humanoid robot equipped with an artificial womb by 2026. The project was introduced at the 2025 World Robot Conference in Beijing and is being hailed as a potential revolution in reproductive science.

What Makes This Robot Unique?

  • Artificial Womb Integration: Carries a fetus from fertilization to full-term birth using synthetic amniotic fluid and nutrient-delivery tubes.
  • Interactive Pregnancy: Mimics the entire gestation process—from implantation to delivery.
  • Affordability: Target price under 100,000 yuan (~$13,900), far cheaper than traditional surrogacy.

Feature Overview

Feature | Description
Artificial amniotic fluid | Maintains fetal hydration and temperature
Nutrient tubes | Deliver oxygen and nutrients, similar to an umbilical cord
Temperature regulation | Simulates womb-like warmth and stability
Oxygen monitoring | Ensures safe fetal development

Ethical & Legal Challenges

  • Surrogacy Ban: Surrogacy is illegal in China; Kaiwa is negotiating with Guangdong authorities.
  • Embryo Research Limits: Restricted to 14 days under current law.
  • Concerns:
    • Parental rights and legal guardianship
    • Psychological impact on robot-born children
    • Potential misuse for mass reproduction or genetic engineering

Societal Reactions

  • Some hail it as a lifeline for infertile couples.
  • Others call it “dehumanizing” or a “dystopian nightmare.”
  • Feminist critiques warn it could undermine maternal identity.

Time Reversal: Scientists Achieve a Real-Life Sci-Fi Breakthrough

Austrian scientists have pulled off something that sounds straight out of science fiction: they’ve successfully reversed time for a single photon using a device called a quantum switch. This isn’t about building a time machine to visit the dinosaurs—it’s about manipulating the flow of time within quantum systems, where the rules of reality bend in mind-boggling ways.

What They Did

  • Researchers from the Austrian Academy of Sciences (ÖAW) and the University of Vienna used a photon (a particle of light) and sent it through a crystal.
  • With the help of the quantum switch, they were able to rewind the photon’s state—returning it to how it was before the journey began.
  • This process, called a rewind protocol, works even without knowing what happened to the particle during its journey—a feat previously thought impossible in quantum mechanics (sketched schematically below).
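
In schematic terms (a standard way to state the result, not the authors' notation): if an unknown evolution $U$ has taken the photon from $|\psi_0\rangle$ to $|\psi\rangle = U|\psi_0\rangle$, the rewind protocol applies an effective inverse without ever identifying $U$:

$$ |\psi\rangle = U|\psi_0\rangle \ \longrightarrow\ U^{\dagger}|\psi\rangle = |\psi_0\rangle $$

The striking part is that the protocol works for any unknown $U$, rather than requiring the experimenter to first characterize the evolution and then engineer its inverse.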

Quantum Switch
Image Credits – S. Kelley/JQI


Fast-Forwarding Time Too

  • The team didn’t stop at rewinding. They also discovered how to accelerate time for a quantum system.
  • By redistributing “evolutionary time” among identical systems, they made one system undergo ten years’ worth of evolution in just one year, while the others remained unchanged.

Why It Matters

  • While reversing time for humans is far beyond reach (it would take millions of years to rewind even one second of a person’s life), this discovery could revolutionize quantum computing.
  • It opens the door to undoing errors in quantum processors, making them more powerful and reliable.

A New Way to Watch Reality

Physicist Miguel Navascués likened classical physics to watching a movie in a theater—linear and unchangeable. Quantum physics, he said, is like watching at home with a remote: you can rewind, fast-forward, or skip scenes.

Mass Production of World's First Non-Binary AI Chip Marks a New Era in Computing

China has commenced mass production of the world’s first non-binary AI chip, a groundbreaking development that challenges traditional computing limitations. Developed by Professor Li Hongge’s team at Beihang University, this innovation integrates binary logic with stochastic computing, paving the way for energy-efficient, high-performance AI hardware.

What Is a Non-Binary Chip?

For decades, computers have operated on binary logic, where every calculation relies on sequences of 0s and 1s. While highly efficient, binary computing faces growing challenges in power consumption and adaptability. A non-binary chip introduces Hybrid Stochastic Numbers (HSN), a fusion of traditional binary numbers with probability-based values. Instead of relying solely on rigid binary operations, these chips leverage randomness to optimize calculations, improving efficiency and fault tolerance.
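
To make the probability-based half of that idea concrete, here is a minimal sketch of classic stochastic computing (an illustration of the general technique; the article does not disclose Beihang's actual HSN encoding). A value p in [0, 1] is encoded as a random bitstream in which each bit is 1 with probability p, and multiplication then reduces to a bitwise AND of two streams:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # bitstream length; longer streams give lower error

def encode(p, n=N):
    """Encode a value in [0, 1] as a random bitstream with P(bit=1) = p."""
    return rng.random(n) < p

def decode(stream):
    """Estimate the encoded value as the fraction of 1s."""
    return stream.mean()

a, b = 0.6, 0.3
sa, sb = encode(a), encode(b)

# In stochastic computing, multiplication is a single AND gate per bit:
product = decode(sa & sb)
print(f"stochastic: {product:.4f}  exact: {a * b:.4f}")
# Estimation error shrinks as 1/sqrt(N), and individual bit flips barely
# matter, which is the fault tolerance the article mentions.
```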

A Solution to Major Tech Roadblocks

This non-binary chip addresses two critical hurdles in computing:
  • The Power Wall: Traditional chips consume excessive energy, limiting scalability. Non-binary chips significantly reduce power consumption while maintaining speed.
  • The Architecture Wall: Many experimental non-silicon chips struggle to integrate with existing systems. This new technology seamlessly aligns with CMOS-based architectures, ensuring compatibility.

Real-World Applications and Strategic Advantages

China is deploying these chips across various industries, including aviation, industrial control systems, and intelligent displays, enabling real-time AI processing with superior efficiency.

Moreover, the chip’s domestic production circumvents U.S. semiconductor export restrictions, reinforcing China’s push for technological self-reliance. The U.S. has imposed strict export restrictions on Nvidia’s AI chips, including the H20 model, which was specifically designed to comply with earlier regulations but is now banned. With China developing its own advanced AI chips, it can bypass these restrictions and continue AI development without relying on U.S. technology.

What’s Next?

This breakthrough could reshape the future of AI hardware, creating faster, smarter, and more energy-efficient systems. As global competition in semiconductor technology intensifies, non-binary computing may soon become the new standard.

Could this revolutionize AI-powered industries? Comment below with your opinion.

India’s Next-Gen Weather Model Uses Supercomputer for High-Resolution Forecasting Like Never Before

India has unveiled the Bharat Forecast System (BFS), the world's highest-resolution weather model, operating on a 6-kilometre grid. Developed by the Indian Institute of Tropical Meteorology (IITM), BFS aims to provide highly localized forecasts, improving predictions for disaster risk reduction, agriculture, and public safety.

The system is powered by Arka, a supercomputer with 11.77 petaflops of computational capacity and 33 petabytes of storage, significantly reducing forecast processing time from 10 hours to just 4 hours.

This development comes a few months after Indian space agency ISRO’s National Remote Sensing Centre (NRSC) made a major breakthrough in nowcasting lightning events over India using data from geostationary satellites, achieving enhanced predictive accuracy with a 2.5-hour lead time.

BFS integrates data from 40 Doppler Weather Radars, which will expand to 100, improving real-time monitoring. These radars provide nowcasts—short-term forecasts for the next two hours, crucial for disaster preparedness.


Unlike traditional square grids, BFS uses a triangular cubic octahedral grid, which enhances spatial accuracy and ensures better data distribution. This allows BFS to predict extreme weather events with 30% more accuracy, including cyclones and heavy rainfall.

BFS leverages satellite imagery and AI-driven climate models to refine predictions. AI helps with pattern recognition, improving forecasts for monsoons, heatwaves, and localized storms.

To recall, in October 2023 researchers from the US introduced the Recurrent Earthquake foreCAST (RECAST), a deep learning model for earthquake forecasting.

Compared to global models from the US, UK, and EU, which operate at 9–14 km resolution, BFS's 6-km grid offers unmatched precision over the tropical belt between 30° South and 30° North.
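
For a sense of scale (simple arithmetic, not a figure from the article), the number of grid columns over a given area grows with the square of the resolution ratio:

$$ \left(\frac{9\ \mathrm{km}}{6\ \mathrm{km}}\right)^2 \approx 2.3 \qquad \left(\frac{14\ \mathrm{km}}{6\ \mathrm{km}}\right)^2 \approx 5.4 $$

so BFS resolves roughly 2 to 5 times more columns per unit area than the 9–14 km global models, which is what drives both the added precision and the need for a supercomputer like Arka.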

This breakthrough comes at a crucial time, as extreme weather events increasingly impact India's economy, particularly food inflation and crop damage.

BFS provides hyperlocal forecasts down to the panchayat level, making it one of the most precise weather models in the world.

This breakthrough positions India as a global leader in meteorology, enhancing agriculture, disaster management, and climate resilience.

A major leap for India's meteorological capabilities! What do you think—could this reshape climate resilience strategies? Comment with your opinion below.

Disclaimer: All images are representational.

China Achieves Historic First: Refueling a Running Nuclear Reactor
  • China Revives Abandoned U.S. Nuclear Tech to Achieve Energy Breakthrough
  • China now has world's first operational thorium nuclear reactor
Chinese scientists have achieved a major breakthrough in nuclear energy by reviving old research from the United States. They built a unique reactor in the Gobi Desert that runs on thorium, a different and safer fuel than uranium, and produces less nuclear waste than traditional reactors.

The most impressive part of the achievement: they successfully refueled the experimental thorium molten salt reactor while it was still running, something no one had done before.

The technology was originally developed in the U.S. in the 1950s, but the U.S. shifted focus to uranium-based reactors and abandoned it, leaving the research publicly available. Chinese scientists capitalized on this abandoned knowledge and refined it into a working prototype. If the innovation can be scaled up, it could lead to cleaner and safer nuclear power, marking a significant step toward sustainable energy and reduced carbon emissions.

Comic Timing

The timing of China’s nuclear breakthrough is almost poetic, given the ongoing tariff war with the U.S. Right now, Washington and Beijing are locked in a tense trade battle, with the U.S. imposing up to 145% tariffs on Chinese goods, while China retaliates with 125% tariffs on American imports.

Against this backdrop, China’s successful revival of abandoned U.S. nuclear research feels like a strategic flex. It’s as if Beijing is saying, “You may have left this behind, but we’ve turned it into a game-changer.” The fact that the U.S. originally developed thorium reactor technology in the 1950s, only to abandon it, makes this moment even more ironic.

While trade tensions escalate, China is making strides in energy independence, potentially reducing reliance on foreign fuel sources. If thorium reactors prove viable on a large scale, China could strengthen its energy security, making it less vulnerable to external pressures—including economic sanctions.

It’s an interesting mix of scientific progress and geopolitical maneuvering.

Nuclear Technology

This reactor generates 2 megawatts (MW) of thermal power, enough to supply around 2,000 households, and it significantly reduces nuclear waste compared to conventional uranium reactors. Given China’s goal of carbon neutrality by 2060, this breakthrough could play a crucial role in its clean energy transition.

China’s breakthrough in nuclear energy revolves around a thorium molten salt reactor (TMSR), a next-generation nuclear system that operates differently from traditional uranium-based reactors. Here are the key technical details:
  • Fuel Source: Instead of solid uranium rods, this reactor uses liquid thorium dissolved in molten salt.
  • Refueling Innovation: Scientists successfully refueled the reactor while it was still running, a feat never achieved before.
  • Safety Features: The molten salt system prevents overheating, making meltdowns nearly impossible.
  • Efficiency: Thorium reactors extract more energy per unit of fuel compared to uranium reactors.
  • Waste Reduction: Produces minimal long-lived radioactive waste, unlike conventional nuclear reactors.
  • Self-Regulating Mechanism: If the reactor overheats, the molten salt expands, automatically reducing nuclear reactions.
  • Emergency Shutdown System: A freeze plug at the reactor’s base melts in emergencies, draining the fuel into a safe storage chamber to stop reactions instantly.
  • Power Output: The experimental reactor generates 2 megawatts (MW) of thermal power, enough to supply around 2,000 households; see the quick check below.
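
As a sanity check on that figure (simple arithmetic; note the quoted output is thermal, not electrical):

$$ \frac{2\ \mathrm{MW}}{2000\ \text{households}} = 1\ \mathrm{kW}\ \text{per household (average)} $$

which is a plausible average household draw, so the "2,000 households" claim is consistent with a 2 MW rating.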
This breakthrough could redefine nuclear energy by making it safer, cleaner, and more sustainable. What's your take? Comment below.

Breakthrough: Newly Discovered Algae Might Cut Fertilizer Use Forever

Imagine plants are like hungry children who need a special ingredient—nitrogen—to grow strong. Normally, they get it from fertilizers, which farmers add to the soil. But scientists just found a tiny algae in the ocean that can make its own nitrogen, without needing fertilizer!

This algae has a tiny helper inside it called a nitroplast—kind of like a built-in kitchen where it cooks up nitrogen from the air. This is a big deal because, until now, scientists thought only bacteria could do this trick.

Why does this matter? Well, if we can use this discovery to engineer crops that do the same, farmers may no longer need to use as many fertilizers. That means cheaper farming, less pollution, and healthier soils—all thanks to this little ocean algae working its magic.

For the uninitiated, nitrogen fixation is the process of converting atmospheric nitrogen (N₂) into a biologically usable form, such as ammonia (NH₃). Even though nitrogen makes up about 78% of Earth's atmosphere, plants can't absorb it directly. Instead, they rely on nitrogen-fixing organisms—like bacteria in soil or, as recent research suggests, certain algae—to transform nitrogen into compounds they can use for growth.

This process is crucial because nitrogen is a key nutrient for plants, directly influencing their ability to produce proteins, enzymes, and chlorophyll. Without nitrogen fixation, ecosystems would struggle to sustain life, and farmers would be far more dependent on synthetic fertilizers, which can contribute to environmental issues like soil degradation and water pollution.

In the latest development this month, researchers have discovered a marine alga, Braarudosphaera bigelowii, that can fix nitrogen thanks to a newly identified organelle called a nitroplast. This is groundbreaking because, until now, nitrogen fixation was thought to be exclusive to bacteria and archaea.

 
Braarudosphaera

The nitroplast evolved from a symbiotic bacterium that began living inside the alga about 100 million years ago. Over time, the bacterium became an integral part of the algal cell, functioning as an organelle rather than a separate organism. The nitroplast converts atmospheric nitrogen (N₂) into ammonia (NH₃), a nutrient crucial for growth.

How It Was Identified

Researchers used soft X-ray tomography to observe the nitroplast’s behavior during cell division, confirming that it divides in sync with other organelles.

Genetic analysis revealed that the nitroplast relies on proteins from the host alga, further supporting its classification as an organelle rather than a free-living symbiont.

This discovery could pave the way for genetically engineered crops that require little to no fertilizer, reducing environmental impact and agricultural costs.

The discovery of nitrogen-fixing algae opens up exciting possibilities—potentially reducing the need for artificial fertilizers while promoting sustainable agriculture.

Oxford Scientists Claim to Have Achieved Teleportation Using a Quantum Supercomputer

Scientists at the University of Oxford have successfully demonstrated quantum teleportation using a scalable quantum supercomputer. This breakthrough involves teleporting logical gates (the fundamental components of quantum algorithms) across a network link, rather than just transferring quantum states.
 
Dougal Main and Beth Nichol working on the distributed quantum computer. Image credit: John Cairns.


This achievement addresses the scalability problem in quantum computing, potentially paving the way for a future quantum internet that could offer ultra-secure communication and computation. It's a significant step towards making quantum computing practical on a large scale.

According to study lead Dougal Main, this is a significant advance: previous demonstrations of quantum teleportation focused on transferring quantum states between physically separated systems, whereas this study achieved the teleportation of logical gates (the fundamental components of quantum algorithms) across a network link.

Quantum teleportation is a fascinating process, but it is different from science-fiction teleportation, which is often depicted as the instantaneous transport of a person or object from one location to another. Quantum teleportation instead involves transferring quantum information from one location to another without physically moving the particles involved.


It's important to note that quantum teleportation doesn't involve the physical transportation of particles themselves, just the transfer of their quantum state. Also, classical information must be sent alongside the quantum process, so it doesn't violate the speed of light limit.
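
The single-qubit teleportation primitive underlying all of this can be simulated on a classical statevector. The following sketch is the generic textbook protocol in numpy (not the Oxford trapped-ion implementation): it teleports a random qubit state and shows exactly where the two classical measurement bits enter.

```python
import numpy as np

# Textbook quantum teleportation on a 3-qubit statevector (qubit 0 is the
# leftmost/most-significant index). Illustrative simulation only.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

rng = np.random.default_rng(7)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                      # random state to teleport

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)              # shared pair on qubits 1, 2
state = np.kron(psi, bell)                      # full 3-qubit state

# CNOT with control qubit 0 and target qubit 1, built as an 8x8 permutation
CNOT01 = np.zeros((8, 8), dtype=complex)
for b in range(8):
    b0, b1, b2 = (b >> 2) & 1, (b >> 1) & 1, b & 1
    CNOT01[(b0 << 2) | ((b1 ^ b0) << 1) | b2, b] = 1
state = CNOT01 @ state
state = np.kron(np.kron(H, I2), I2) @ state     # Hadamard on qubit 0

# Measure qubits 0 and 1 (sampling the full register gives the same marginal)
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse onto the measured values of qubits 0 and 1, then renormalize
keep = np.array([((b >> 2) & 1) == m0 and ((b >> 1) & 1) == m1
                 for b in range(8)])
state = np.where(keep, state, 0)
state /= np.linalg.norm(state)

# Receiver applies Pauli corrections chosen by the two CLASSICAL bits m0, m1
q2 = state.reshape(2, 2, 2)[m0, m1, :]
corrected = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ q2

print(f"fidelity = {abs(np.vdot(psi, corrected)) ** 2:.6f}")  # -> 1.000000
```

Without the two classical bits, the receiver's qubit is in one of four Pauli-scrambled states at random, which is why teleportation cannot transmit information faster than light.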

In this study published in Nature, the team used quantum teleportation to create interactions between distant systems, allowing them to perform logical quantum gates between qubits housed in separate quantum computers. This effectively "wires together" distinct quantum processors into a single, fully-connected quantum computer.

The researchers developed a scalable architecture based on modules containing a small number of trapped-ion qubits (atomic-scale carriers of quantum information). These modules are linked together using optical fibers and photonic links (light-based data transmission) rather than electrical signals.

The photonic links enable qubits in separate modules to be entangled, allowing quantum logic to be performed across the modules using quantum teleportation. This means that logical operations can be executed between qubits housed in different quantum computers.

By linking multiple quantum processors, the researchers effectively created a distributed quantum computer. This approach addresses the scalability problem by allowing computations to be distributed across the network, potentially enabling the connection of millions of qubits.

The breakthrough could lay the groundwork for a future quantum internet, where distant processors form an ultra-secure network for communication, computation, and sensing.

Professor David Lucas, principal investigator of the research team and lead scientist for the UK Quantum Computing and Simulation Hub, led from the Department of Physics, said:
“Our experiment demonstrates that network-distributed quantum information processing is feasible with current technology.”


“Scaling up quantum computers remains a formidable challenge that will likely require new physics insights and intensive engineering efforts over the coming years,” said Professor Lucas.

This modular approach, with small trapped-ion modules linked by optical fiber, could ultimately overcome the scalability challenges facing quantum computing.

It's an exciting development that brings us closer to realizing the full potential of quantum computing on a practical scale.

Kolkata Scientists at SNBNCBS Develop Unique Transistor for Faster, Greener Electronics

Scientists at the S. N. Bose National Centre for Basic Sciences (SNBNCBS), Kolkata, an autonomous institute under the Department of Science and Technology (DST), Govt. of India, have developed a unique transistor using single molecules, controlled by mechanical forces rather than traditional electrical signals. This approach, known as the mechanically controllable break junction (MCBJ), involves creating a sub-nanometer gap in a metal wire to accommodate a single molecule such as ferrocene.

A transistor is a fundamental component of modern electronics, used to amplify or switch electronic signals and electrical power. Transistors appear in virtually every modern device, including computers, smartphones, and amplifiers, making them crucial for modern technology.

The transistor's performance is influenced by the orientation of the ferrocene molecules between electrodes, which can either enhance or diminish electrical conductivity. This breakthrough could lead to advancements in quantum information processing, ultra-compact electronics, and low-power molecular devices.

Dr. Atindra Nath Pal and Biswajit Pabi, in collaboration with their team, conducted experiments and discovered that the orientation of ferrocene molecules between silver electrodes significantly affects the transistor's performance. Depending on the molecular orientation, the device can either enhance or diminish electrical conductivity through the junction, underscoring the importance of molecular geometry in transistor design.

Going forward, the scientists at SNBNCBS explored gold electrodes with ferrocene at room temperature. This combination resulted in a surprisingly low junction resistance of roughly five times the resistance quantum (h/2e² ≈ 12.9 kΩ), i.e. about 65 kΩ, significantly lower than the ~1 MΩ typical of molecular junctions. This suggests the possibility of creating low-power molecular devices.

It's a breakthrough development for the future of electronics, making them faster and more energy-efficient.

How does the mechanically gated transistor work?

Mechanical gating response of a ferrocene molecule connected between two silver electrodes

The mechanically gated transistor operates by using mechanical forces to control the flow of electrical current through a single molecule. Here's a simplified breakdown of how it works:

1. Mechanically Controllable Break Junction (MCBJ): This technique involves creating a tiny gap in a metal wire, just large enough to fit a single molecule. This gap is controlled mechanically, often by bending the wire.

2. Single Molecule Placement: A molecule, such as ferrocene, is placed in this gap. The molecule acts as the active component of the transistor.

3. Mechanical Force Application: By applying mechanical forces (e.g., stretching or compressing the wire), the orientation and position of the molecule can be altered.

4. Conductivity Modulation: The electrical conductivity between the electrodes is influenced by the molecule's orientation. When the molecule is aligned in a certain way, it can either enhance or reduce the flow of electrons, effectively acting as a switch.

5. Signal Control: Unlike traditional transistors that use electrical signals to control current flow, this transistor uses mechanical forces, which can lead to lower power consumption and potentially faster switching speeds.

This innovative approach could pave the way for more energy-efficient and compact electronic devices.
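
For a back-of-the-envelope feel of the numbers involved, here is a standard Landauer-picture estimate (generic single-channel physics, not the group's actual model): junction resistance is the resistance quantum divided by the transmission of the molecular channel.

```python
# Landauer picture of a single-molecule junction: G = G0 * T, where
# G0 = 2e^2/h is the conductance quantum and T the channel transmission.
# Illustrative transmissions only, not measured SNBNCBS values.
h = 6.62607015e-34   # Planck constant (J*s)
e = 1.602176634e-19  # elementary charge (C)

G0 = 2 * e**2 / h                 # conductance quantum, ~77.5 microsiemens
R0 = 1 / G0                       # resistance quantum, ~12.9 kilo-ohms
print(f"R0 = {R0 / 1e3:.1f} kOhm")

for T in (1.0, 0.2, 0.01):        # transmission set by molecular orientation
    R = R0 / T
    print(f"T = {T:5.2f} -> junction resistance ~ {R / 1e3:8.1f} kOhm")
# T ~ 0.2 gives ~65 kOhm (about five times R0, as in the gold/ferrocene
# junctions above), while T ~ 0.01 approaches the ~1 MOhm typical of
# molecular junctions; mechanically reorienting the molecule shifts T,
# which is what gates the current.
```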

Molecular structure of ferrocene



Indian-origin Techie Develops Technology to Charge Phones/Laptops in A Minute and EVs in 10 Minutes

Ankur Gupta, an Indian-origin researcher and assistant professor of chemical and biological engineering at the University of Colorado Boulder, has made a significant breakthrough. His research has led to the development of a technology that can charge laptops and mobile phones in just one minute, and that could charge electric vehicles in about 10 minutes.

The key to this innovation lies in the efficient movement of ions within a complex network of minuscule pores, which could lead to more efficient energy storage devices like supercapacitors. Supercapacitors are known for their rapid charging times and longer lifespans compared to traditional batteries. Gupta's work modifies Kirchhoff’s law, which has governed current flow in electrical circuits since 1845, by demonstrating how ions move due to both electric fields and diffusion.

This discovery is promising not only for personal electronic devices but also for power grids, where fluctuating energy demand requires efficient storage to avoid waste during periods of low demand and to ensure rapid supply during high demand. It's an exciting development in the field of energy storage that could revolutionize how we charge our devices in the future.

The research, still in the development phase, has been published in the Proceedings of the National Academy of Sciences. Gupta's team at the University of Colorado Boulder has discovered how ions move within a complex network of minuscule pores, a significant step toward developing more efficient energy storage devices like supercapacitors.

Technology

Ankur Gupta's technology is based on the efficient movement of ions within a complex network of microscopic pores, leading to rapid charging capabilities for devices like laptops, mobile phones, and electric vehicles.

Simplified Explanation of How Ankur Gupta's Tech works:

Supercapacitors: The technology utilizes supercapacitors, which are energy storage devices that store and release energy by accumulating ions in their pores.

Ion Movement: Unlike traditional batteries, where ions move relatively slowly, Gupta's technology allows for a more efficient movement of ions. This is achieved by optimizing the flow within a complex structure of interconnected pores.

Charging Speed: By enhancing ion mobility, the charging process becomes much faster, allowing for a laptop or phone to be charged in just a minute and an electric vehicle in about 10 minutes.

Energy Storage: This method is not only beneficial for personal electronics but also for power grids, where efficient energy storage is crucial to handle fluctuating demands.

The breakthrough lies in modifying Kirchhoff’s law, which traditionally describes current flow in electrical circuits. Gupta's research demonstrates how ions move due to both electric fields and diffusion, which is a significant departure from the behavior described by Kirchhoff’s law in a single straight pore.

This discovery enables the simulation and prediction of ion flow in a complex network of thousands of interconnected pores within minutes, which was previously not possible. It's a leap forward in energy storage technology, promising faster and more efficient charging for a variety of applications.
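
As an illustration of what such a network-level computation looks like, the sketch below is a generic Kirchhoff-style network solve (not Gupta's modified law, which additionally accounts for diffusion alongside electric-field drift): each pore gets a conductance, and the potentials at every junction come out of one linear solve.

```python
import numpy as np

# Generic steady-state network solve: nodes are pore junctions, edges are
# pores with random conductances. Purely illustrative; the published work
# extends this classical Kirchhoff picture with ionic diffusion terms.
rng = np.random.default_rng(0)
n = 200

edges = {(k, k + 1) for k in range(n - 1)}      # backbone keeps graph connected
while len(edges) < 800:                          # plus random cross-links
    i, j = rng.integers(0, n, size=2)
    if i != j:
        edges.add((min(i, j), max(i, j)))
g = {e: rng.uniform(0.1, 1.0) for e in edges}    # pore conductances (arb. units)

L = np.zeros((n, n))                             # graph Laplacian
for (i, j), gij in g.items():
    L[i, i] += gij; L[j, j] += gij
    L[i, j] -= gij; L[j, i] -= gij

# Fix potentials at the two terminals, solve Kirchhoff's equations inside
V = np.zeros(n)
V[0], V[n - 1] = 1.0, 0.0                        # inlet and outlet electrodes
free = list(range(1, n - 1))
b = -(L[np.ix_(free, [0, n - 1])] @ V[[0, n - 1]])
V[free] = np.linalg.solve(L[np.ix_(free, free)], b)

# Total current entering at the inlet (Kirchhoff's current law at node 0)
I_in = sum(gij * (V[0] - V[j if i == 0 else i])
           for (i, j), gij in g.items() if 0 in (i, j))
print(f"total current through the pore network: {I_in:.4f} (arb. units)")
```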

The research is ongoing, and it may take some time before we see this technology implemented in everyday devices. However, the potential impact of such a technology on the market and our daily lives could be substantial, offering much faster charging times and longer-lasting energy storage solutions.

Source – Colorado.edu

Intel Makes AI Breakthrough with World’s Largest Neuromorphic System Inspired By Human Brain

Intel has recently made a significant breakthrough in the field of artificial intelligence (AI) with the creation of the world's largest neuromorphic system. This remarkable system, codenamed Hala Point, represents a major leap forward in sustainable AI research and development.

Neuromorphic systems are designed to imitate the electrical behavior of real neurons, found in the human brain, more closely, which can speed up computation and reduce energy use.

Intel's Hala Point is an advanced neuromorphic system designed to emulate the intricate workings of the human brain. It contains an impressive 1.15 billion neurons. To put this into perspective, that's more neurons than there are stars in our Milky Way galaxy!

Hala Point contains 1.15 billion neurons for more sustainable AI. (Credit: Intel Corporation)


At the heart of Hala Point lies Intel’s Loihi 2 processor, a marvel of engineering. This processor is specifically designed for brain-inspired computing and enables efficient and scalable AI. It combines deep learning efficiency with novel brain-inspired learning and optimization capabilities.

Hala Point demonstrates state-of-the-art computational efficiencies on mainstream AI workloads. It can support up to 20 quadrillion operations per second (20 petaops) with an efficiency exceeding 15 trillion 8-bit operations per second per watt (TOPS/W) when executing conventional deep neural networks. These levels rival and even exceed architectures built on graphics processing units (GPUs) and central processing units (CPUs).

The Intel Neuromorphic Research Team poses for a photo with Hala Point (from left): Patricio Martinez, platform hardware design engineer; Eduardo Quijano Centeno, lead platform hardware design engineer; Gerardo Peralta Francisco, platform hardware designer; and Leobardo Campos Macias, AI applied research scientist. (Credit: Intel Corporation)

Applications:

Hala Point's unique capabilities open up exciting possibilities for real-time continuous learning in various AI applications. These include:
  • Scientific and Engineering Problem-Solving: Researchers can leverage Hala Point for solving complex scientific and engineering challenges.
  • Logistics and Smart City Infrastructure Management: The system can enhance logistics and optimize smart city operations.
  • Large Language Models (LLMs): Hala Point could contribute to the development of more powerful language models.
  • AI Agents: It has the potential to improve AI agents' adaptability and efficiency.
Initially deployed at Sandia National Laboratories, Hala Point will support advanced brain-scale computing research. Scientists will focus on solving problems related to device physics, computer architecture, computer science, and informatics. In essence, Hala Point represents a critical step toward more sustainable and efficient AI technology.

This achievement by Intel underscores the importance of brain-inspired computing and its potential impact on the future of AI. With Hala Point, we're moving closer to unlocking new frontiers in artificial intelligence.

"The computing cost of today’s AI models is rising at unsustainable rates. The industry needs fundamentally new approaches capable of scaling. For that reason, we developed Hala Point, which combines deep learning efficiency with novel brain-inspired learning and optimization capabilities. We hope that research with Hala Point will advance the efficiency and adaptability of large-scale AI technology," Mike Davies, director of the Neuromorphic Computing Lab at Intel Labs.

Conventional AI systems, including those based on deep learning, rely on silicon-based computer architectures (such as CPUs and GPUs). These architectures were originally designed for general-purpose computing and do not directly mimic the brain's structure.

Neuromorphic computing, on the other hand, emulates the human brain's mechanisms within its architecture. It takes inspiration from the brain's neural networks, neurons, and synapses, with the goal of creating hardware that operates more like the brain, enabling efficient, brain-inspired computation.
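
The basic unit such hardware implements in silicon is the spiking neuron. Here is a minimal leaky integrate-and-fire model (a standard textbook neuron with made-up parameters, not Intel's actual Loihi 2 neuron model):

```python
import numpy as np

# Leaky integrate-and-fire (LIF) neuron: the membrane potential leaks toward
# rest, integrates input current, and emits a discrete spike on threshold.
# Illustrative parameters only.
dt = 1e-3          # timestep (s)
tau = 20e-3        # membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

v = v_rest
spikes = []
rng = np.random.default_rng(1)
for step in range(1000):                     # simulate 1 second
    i_in = rng.uniform(0.0, 2.5)             # random input current (arb. units)
    v += dt / tau * (-(v - v_rest) + i_in)   # leaky integration
    if v >= v_thresh:                        # threshold crossing -> spike
        spikes.append(step * dt)
        v = v_reset                          # reset after spiking
print(f"{len(spikes)} spikes in 1 s of simulated time")
```

Because computation and communication happen only when spikes occur, a chip built from such units sits idle (and draws little power) whenever its inputs are quiet, which is the root of the efficiency claims above.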

In A Breakthrough, NTT Develops AI that Can Answer All Kinds of Questions Based on Document Images

Realizing LLM-based visual machine reading comprehension technology

Toward a "tsuzumi" that can read and understand visual documents

NTT Corporation has made significant progress in the field of artificial intelligence (AI) with their LLM-based visual machine reading comprehension technology. This breakthrough aims to enable AI systems to answer a wide range of questions based on document images, which is crucial for digital transformation (DX).

Real-world documents often contain both text and visual elements (such as icons, diagrams, etc.). However, existing AI models, including large language models (LLMs), primarily focus on understanding text information.

To address this limitation, NTT proposed visual machine reading comprehension technology. The goal was to create an AI system that can read and understand visual documents and information, similar to how humans do.

Comparison of text-based and visual machine reading comprehension.

Previous visual machine reading comprehension techniques struggled with arbitrary tasks, such as information extraction from invoices. Achieving high performance without extensive training data was challenging.

NTT aimed to develop a visual machine reading comprehension model with high instruction-following ability, akin to LLMs.

NTT successfully developed a new visual machine reading comprehension technology that leverages the reasoning ability of LLMs.

The model visually understands documents by analyzing both text and visual information. It can answer complex questions involving diagrams, such as understanding pie charts or other visual representations.

The research results were presented at the 38th Annual AAAI Conference on Artificial Intelligence and received the Outstanding Paper Award at the 30th Annual Conference of the Association for Natural Language Processing.

Notably, this paper is the first to propose a specific methodology for LLM-based visual document understanding.

Tsuzumi


NTT's large language model, called 'Tsuzumi', plays a central role in this technology. Tsuzumi is designed to address the energy-consumption challenges associated with large-scale LLMs, aiming to reduce learning and inference costs while maintaining high performance.

The name "Tsuzumi" symbolizes the start of a Gagaku (ancient Japanese court music and dance) ensemble, emphasizing its role in driving industrial development.


Technology

NTT's visual machine reading comprehension technology visually understands documents by utilizing the high reasoning ability of LLMs (figure below). To achieve this goal, (1) NTT researchers developed a new adapter technology that can convert document images into the LLM's representations, and (2) constructed the first large-scale visual instruction tuning datasets for diverse visual document understanding tasks. These enable LLMs to understand the content of documents by combining vision and language information and to perform arbitrary tasks without additional training. An illustrative sketch of this adapter pattern follows the figure below.

Overview of LLM-based visual machine reading comprehension technology.
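
A rough, hypothetical sketch of the adapter pattern described above (names and dimensions are invented for illustration; NTT has not published code here): a small trainable module projects vision-encoder patch features into the LLM's token-embedding space, so the LLM can attend over image regions and text tokens together.

```python
import torch
import torch.nn as nn

# Hypothetical adapter: project vision features into an LLM's embedding space.
# Sizes and names are illustrative, not NTT's actual architecture.
class VisionToLLMAdapter(nn.Module):
    def __init__(self, vision_dim=1024, llm_dim=4096):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(vision_dim, llm_dim),
            nn.GELU(),
            nn.Linear(llm_dim, llm_dim),
        )

    def forward(self, patch_feats):        # (batch, n_patches, vision_dim)
        return self.proj(patch_feats)      # (batch, n_patches, llm_dim)

adapter = VisionToLLMAdapter()
img_feats = torch.randn(1, 196, 1024)      # e.g. ViT patch features of a page
img_tokens = adapter(img_feats)            # now "look like" LLM tokens
text_embeds = torch.randn(1, 24, 4096)     # embedded question tokens
llm_input = torch.cat([img_tokens, text_embeds], dim=1)
print(llm_input.shape)                     # torch.Size([1, 220, 4096])
```

The LLM then processes this mixed sequence exactly as it would plain text; in systems built this way, typically only the adapter (plus instruction-tuning data) is trained, which keeps costs low.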

LLMs with NTT's technology can be used for office work and daily-life situations that require human cognition, such as searching and screening documents or assisting in reading specialized literature.

In conclusion, NTT's breakthrough in LLM-based visual machine reading comprehension technology brings us closer to AI systems capable of understanding and answering questions based on visual documents—a critical step in the digital transformation journey.

This result is the outcome of joint research conducted in FY2023 with Professor Jun Suzuki of the Center for Data-driven Science and Artificial Intelligence, Tohoku University.

This technology will contribute to the development of important industrial services such as web search and question answering based on real-world visual documents. We aim to establish the technology to realize AI that creates new values by collaborating with humans, including work automation.

Qualcomm Unveils Breakthrough Wi-Fi Tech and New AI-Ready IoT and Industrial Platforms

At the Embedded World 2024 exhibition and conference in Nuremberg, Qualcomm made significant announcements related to IoT and Wi-Fi Technology.

The company unveiled the Qualcomm® QCC730, a disruptive micro-power Wi-Fi system for IoT connectivity. This technological breakthrough provides up to 88% lower power consumption than previous generations and can revolutionize products in battery-powered industrial, commercial, and consumer applications.

Qualcomm also introduced the new Qualcomm RB3 Gen 2 Platform — a comprehensive hardware and software solution designed for IoT and embedded applications. Utilizing the Qualcomm® QCS6490 processor, the RB3 Gen 2 offers a combination of high-performance processing, 10x increase in on-device AI processing, support for quadruple 8MP+ camera sensors, computer vision, and integrated Wi-Fi 6E.

The two technologies unveiled are —

1. Breakthrough Wi-Fi Technology

The Qualcomm QCC730 is a groundbreaking micro-power Wi-Fi system tailored for IoT connectivity. Here are its key features:

Unprecedented Low Power Consumption for Extended Battery Life
  • The QCC730 delivers up to 88% lower power consumption compared to previous generations, potentially revolutionizing battery-powered industrial, commercial, and consumer applications. 
  • Its selectable power modes and innovative power management maximize savings for extremely long battery life.
Flexibility to Operate in Hostless or Hosted Mode
  • Developers can choose between hostless or hosted mode, providing extreme flexibility for different use cases.
  • It supports internal or external power amplifiers and has integrated, non-volatile memory (RRAM) for ease of design.
Versatile System Integration for Ease of Design
  • QCC730 fully integrates the on-chip microcontroller, non-volatile memory (NVM), and SRAM, making it versatile and easy to design with.
  • Developers can replace or integrate it with traditionally Bluetooth-only applications.
Complete Cloud Connectivity Stack with Open-Source Software SDK
  • QCC730 comes with an open-source software SDK available on CodeLinaro, allowing cloud connectivity offloading.
  • It empowers developers with an alternative to traditionally Bluetooth-only applications and enables direct cloud connectivity.

In summary, the Qualcomm QCC730 offers ultra-low micro-power Wi-Fi, scalability, and versatility for IoT applications, making it a powerful choice for connected devices.

2. AI Ready IoT & Industrial Platform

The Qualcomm RB3 Gen 2 Platform is designed to empower developers in creating a broad range of IoT products across various segments, including consumer, enterprise, industrial robotics, and automation.

High-Performance Processing
  • The RB3 Gen 2 Platform is powered by the Qualcomm QCS6490 processor, offering advanced performance capabilities.
  • It combines powerful AI processing, computer vision, and blazing-fast wireless connectivity.
On-Device AI Capabilities
  • Developers can leverage the platform's AI acceleration for tasks such as image and video capture, enhancing overall functionality.
  • This opens up possibilities for applications in workplace security, automation, and more.
Support for Multiple Camera Sensors
  • The platform supports multiple 8MP+ camera sensors, enabling robust computer vision capabilities.
  • This feature is essential for applications in robotics, industrial automation, and other visual tasks.
Integrated Wi-Fi 6E
  • With built-in Wi-Fi 6E, the RB3 Gen 2 Platform ensures high-speed wireless connectivity.
  • This connectivity is crucial for seamless communication in IoT devices.
Bluetooth 5.2 and LE Audio
  • Enhanced Bluetooth 5.2 support allows for wireless accessories and efficient communication.
  • LE Audio ensures improved audio quality for connected devices.
Expandability and Versatility
  • The platform offers expandability through interfaces such as GPIOs, I2C, SPI, UART, PCIe, USB, MIPI CSI/DSI, and SDIO.
  • Developers can easily integrate additional components and customize their solutions.
In summary, the Qualcomm RB3 Gen 2 Platform provides a powerful foundation for creating innovative IoT products, combining performance, AI capabilities, and connectivity. 

For the 1st Time Scientists Found Experimental Evidence of Graviton-like Particle

Gravitons are fascinating hypothetical particles that play a pivotal role in our understanding of gravity. These are the fundamental particles that mediate the force of gravitational interaction in the realm of quantum field theory.

In simpler terms, they carry the gravitational force, much like how photons carry the electromagnetic force. When you toss something upward, and it gracefully descends due to gravity, it's essentially the gravitons at work.

Like photons, gravitons are expected to be massless and electrically uncharged. Gravitons too travel at the speed of light, zipping through the fabric of spacetime. Their existence is rooted in the quest for a unified theory that combines quantum mechanics and gravity.

Gravitons are the focus of the search for the "theory of everything", which would unify Einstein's General Relativity (GR) theory of gravity with quantum theory.

Gravitons remain elusive and unobserved and continue to intrigue scientists as we seek to unravel the mysteries of gravity and the cosmos.

In a recent development, however, scientists have glimpsed graviton-like particles, whose existence has now been observed in a semiconductor.

An international research team led by Chinese scientists has, for the first time, presented experimental evidence of a graviton-like particle called chiral graviton modes (CGMs), with the findings published in the scientific journal Nature on Thursday.

By putting a thin layer of semiconductor under extreme conditions and exciting its electrons to move in concert, researchers from eastern China’s Nanjing University, the United States and Germany found the electrons to spin in a way that is only expected to exist in gravitons.

Despite the breakthrough, Loren Pfeiffer of Princeton University, an author of the paper, said: "This is a needle in a haystack [finding]. And the paper that started this whole thing is from way back in 1993." He wrote that 1993 paper with several colleagues, including Aron Pinczuk, who passed away in 2022, before they could find hints of the gravitons.

The discovery of chiral graviton modes (CGMs) and their shared characteristics with gravitons, a still-undiscovered particle predicted to play a critical role in gravity, could potentially connect two subfields of physics: high-energy physics, which operates across the largest scales of the universe, and condensed matter physics, which studies materials and the atomic and electronic interactions that give them their unique properties.

Scientists in China, the US and Germany used polarised laser light to measure graviton-like excitation and spin in a quantum material. (Image - SCMP.org)

The ability to study graviton-like particles in the lab could help fill critical gaps between quantum mechanics and Einstein’s theories of relativity, solving a major dilemma in physics and expanding our understanding of the universe.

The term "graviton" was coined in 1934 by Soviet physicists Dmitrii Blokhintsev and F. M. Gal'perin. Paul Dirac later reintroduced the term, envisioning that the energy of the gravitational field should come in discrete quanta—these quanta he playfully dubbed "gravitons."

Just as Newton anticipated photons, Laplace also foresaw "gravitons," albeit with a greater speed than light and no connection to quantum mechanics or special relativity.
