Oracle

Oracles are infrastructure components that feed real-time, off-chain data (such as price feeds, weather, or sports results) into blockchain smart contracts, which cannot fetch external data on their own. Decentralized oracle networks such as Chainlink and Pyth underpin most of DeFi. In 2026, oracles have evolved to support verifiable randomness and cross-chain data synchronization. This tag covers the technical evolution of data availability, tamper-resistant price feeds, and the critical role oracles play in letting deterministic smart contracts act on real-world events.
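As a toy illustration of why tamper resistance matters (this is a generic sketch, not any specific oracle network's implementation), a decentralized feed can blunt a single compromised reporter by taking the median of independent node reports rather than the mean:

```python
import statistics

def aggregate_price(reports, min_reports=3):
    """Aggregate independent oracle node reports into one feed value.

    Using the median (rather than the mean) means a minority of
    tampered or faulty reporters cannot move the result arbitrarily.
    """
    if len(reports) < min_reports:
        raise ValueError("not enough reports for a trustworthy round")
    return statistics.median(reports)

# Honest nodes report around 3000; one compromised node reports an extreme value.
honest = [3001.5, 2999.0, 3000.2, 3000.9]
print(aggregate_price(honest + [999999.0]))  # median stays near 3000
```

Real networks layer node staking, reputation, and signed reports on top of the aggregation step, but the median is the core defense this sketch shows.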

5209 Articles
Created: 2026/02/02 18:52
Updated: 2026/02/02 18:52
Securitize Integrates with Ripple to Boost Instant Liquidity for Tokenized Treasury

Securitize and Ripple set to enable instant RLUSD liquidity for tokenized treasuries to streamline institutional finance and boost DeFi access.

Author: Blockchainreporter
Phase 6 Entry Positions Ozak AI Among Top-Performing AI Tokens

A lot of excitement is currently brewing in the crypto space as new entrant Ozak AI crosses over to stage six of its presale, with the price kept at $0.012. With this new achievement, the project now sits firmly in the spotlight as a top contender in the AI token market. What makes this moment […]

Author: LiveBitcoinNews
Exploring zkTLS As A Way To Build A Verifiable and Private Web3

Today the world has become heavily digital-first, as AI and AI-adjacent integrations shape all our interactions and experiences. Privacy and security concerns are more pressing now than ever before. Among the emerging technologies that try to address this, Zero-Knowledge Transport Layer Security, or zkTLS, has caught attention. Let’s take a deep dive.

What is zkTLS?

As the name suggests, this is a hybrid protocol combining two components:

zk: Refers to one of the most popular and effective privacy-preserving techniques in blockchain technology: zero-knowledge proofs (ZKPs). A ZKP is a cryptographic method involving two parties, in which the prover convinces the verifier that a piece of information is known without having to reveal it.

TLS: Refers to a critical part of HTTPS (Hypertext Transfer Protocol Secure), providing the encryption and authentication mechanisms that secure data transmission between client and server.

Fun fact: Not all implementations of TLS attestations use ZKPs, as the focus is on verifiability rather than privacy alone, but the name zkTLS has nonetheless stuck as one of crypto’s newest privacy primitives.

Bottom line: In tandem with confidential computing, zkTLS enables data provenance and encryption, even tapping into previously unusable data. Oasis, with its privacy-first approach and production-ready confidential EVM, Sapphire, has been working with leading zkTLS projects, including PoCs such as onboarding Reclaim Protocol into its ecosystem.

How does zkTLS work?

In simple terms, it allows a user or a server to demonstrate that data fetched via a TLS-secured connection, like an API call to a bank’s server, is authentic, with no extra information exposed in the process. zkTLS generates a proof, such as a zk-SNARK, confirming that data was fetched from a specific server (identified by its public key and domain) via a legitimate TLS session, without exposing the session key or plaintext data.
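The prover/verifier idea behind ZKPs can be made concrete with a toy Schnorr identification round, a classic interactive proof of knowledge. This is an illustrative sketch only (the group parameters are toy choices, and this is not part of any zkTLS implementation): the prover demonstrates knowledge of a secret exponent x with y = g^x mod p without ever transmitting x.

```python
import secrets

# Toy Schnorr identification protocol. Parameters are illustrative,
# not a vetted production group.
P = 2**255 - 19   # a well-known prime modulus
Q = P - 1         # exponents can be reduced mod p-1 (Fermat's little theorem)
G = 2             # fixed base

def prove(x):
    """Prover side: commit to t = g^r, then answer a challenge c with s = r + c*x."""
    r = secrets.randbelow(Q)
    t = pow(G, r, P)
    def respond(c):
        return (r + c * x) % Q   # the secret x itself is never transmitted
    return t, respond

def verify(y, t, c, s):
    """Verifier side: accept iff g^s == t * y^c (mod p)."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P

# One demo round of the interaction.
x = secrets.randbelow(Q)          # prover's secret
y = pow(G, x, P)                  # public value the proof is about
t, respond = prove(x)             # prover commits
c = secrets.randbelow(Q)          # verifier issues a random challenge
assert verify(y, t, c, respond(c))
```

The verification works because g^s = g^(r + c·x) = g^r · (g^x)^c = t · y^c, yet the transcript (t, c, s) reveals nothing useful about x on its own. zk-SNARKs generalize this idea to proving arbitrary statements, such as "this ciphertext decrypts to data from a valid TLS session," non-interactively.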
The process flow is roughly as follows:

1. The client and the server connect over TLS (the “TLS handshake”), establishing a secure session with encryption and server authentication.
2. zkTLS captures session details (e.g., encrypted data and the server certificate) and processes them in a zk-SNARK circuit tailored to TLS constraints.
3. The circuit outputs a proof verifying the data’s authenticity and source while keeping sensitive details hidden.
4. The proof is recorded on a blockchain for decentralized verification.

Let’s now take a quick look at the models.

MPC-based

Here, zkTLS modifies the standard TLS handshake by introducing a network of nodes that collaborate to produce a multi-party key replacing the browser-generated key. With the browser consulting these nodes to generate a shared key through an MPC protocol, no single party ever knows the entire key. The shared key is used for encrypting and decrypting requests and responses, with the browser and all nodes cooperating on every operation. This model enhances security, but the trade-off is networking complexity and overhead due to persistent node coordination.

TEE-based

Here, zkTLS leverages Trusted Execution Environments: tamper-resistant secure enclaves within CPUs that act like a black box and can securely handle HTTPS requests. Sensitive data such as authentication tokens is encrypted and sent to the service provider’s TEE, where decryption happens internally without any exposure to the provider or external systems. The TEE logs in on behalf of the user and securely processes the response, providing cryptographic guarantees about the integrity of the request and response. This model is very efficient, but the trade-off is dependency on TEE hardware and reliance on the manufacturer’s security, e.g. Intel SGX or TDX.

Proxy-based

Here, zkTLS uses HTTPS proxies as intermediaries that forward encrypted traffic between the browser and the website and observe the data exchange.
It is the proxy that provides attestations about the encrypted requests and responses, confirming they originated from the browser or the website. Finally, the browser generates a ZKP allowing decryption of the received data, and since the shared key is never revealed, privacy is preserved. This model avoids the trade-offs of the other two, but has its own challenge: having to trust that the proxy is not malicious.

Key takeaways of zkTLS

zkTLS is a game-changer for web3, and its implications are best understood through the two-pronged problem it solves. For a web2 user, HTTPS means there is end-to-end encryption. However, this isn’t provable; TLS itself is unverifiable, and no privacy is guaranteed. zkTLS brings verifiability to the table, as the proof it generates validates the data and its origin and verifies the transfer. Another benefit of this technology is data privacy.

To those thinking this is just like pulling API data and putting it on-chain, the distinction is tangible: APIs can be disabled at any time, but by attesting an ordinary HTTPS connection, zkTLS ensures continuous data access. Simply stated, this enables any web2 data to be used on a blockchain in a verifiable and permissionless way.

Key use cases of zkTLS in crypto

DeFi lending. Real-world example: 3Jane
Identity verification. Real-world example: Nosh
Privacy-preserving oracles. Real-world example: TLSNotary
Verifiable airdrops. Real-world example: ZKON

The final word on zkTLS is that its design space is vast and full of potential as it evolves by tackling current challenges like scalability, compatibility with varied web systems, and dependence on existing oracle networks. But the promise is real, as the various real-world examples show, with some already in production and many more being explored.
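The attestation idea behind the proxy/notary model can be caricatured in a few lines. This is a deliberately simplified, hypothetical sketch: it shows only the integrity binding (an attestor on the TLS path signs a digest of the request/response transcript), omits the zero-knowledge layer entirely, and uses a shared HMAC key where real notaries use asymmetric signatures. All names (server, key, endpoints) are made up for illustration.

```python
import hashlib
import hmac
import json

ATTESTOR_KEY = b"demo-attestor-key"   # illustrative; real notaries sign with public-key crypto

def attest(server_name: str, request: str, response: str) -> dict:
    """Notary side: bind the server identity and transcript into one signed digest."""
    transcript = json.dumps(
        {"server": server_name, "request": request, "response": response},
        sort_keys=True,
    ).encode()
    digest = hashlib.sha256(transcript).hexdigest()
    tag = hmac.new(ATTESTOR_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"digest": digest, "tag": tag}

def check(att: dict, server_name: str, request: str, response: str) -> bool:
    """Verifier side: recompute the digest and check the attestor's tag in constant time."""
    expected = attest(server_name, request, response)
    return hmac.compare_digest(expected["tag"], att["tag"])

att = attest("api.bank.example", "GET /balance", '{"balance": 1200}')
print(check(att, "api.bank.example", "GET /balance", '{"balance": 1200}'))  # True
print(check(att, "api.bank.example", "GET /balance", '{"balance": 9999}'))  # False
```

In a real zkTLS deployment the browser would additionally wrap the decryption and selective disclosure of the attested transcript in a ZKP, so the verifier learns only the claimed fact, not the full response.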
As the space grows and evolves, the results we have been seeing, and those we can look forward to, give hope that web2-to-web3 interactions between the internet and the blockchain will also drive mass adoption.

Resources: Oasis blog, Reclaim blog, Oasis x Reclaim

Originally published at https://dev.to on September 23, 2025, and in Coinmonks on Medium.

Author: Medium
Sam Altman says concerns over his $850 billion OpenAI expansion are valid

Sam Altman says people are right to be worried about the size of OpenAI’s new expansion, but he isn’t backing down from it. Speaking from a construction site in Abilene, Texas, where OpenAI is building its first mega data center, the CEO told reporters on Tuesday that the $850 billion infrastructure rollout is necessary. “People are worried. I totally get that,” Sam said. “We are growing faster than any business I’ve ever heard of before.”

As Cryptopolitan reported, OpenAI is now committed to building data centers powered by 17 gigawatts of energy, roughly the same output as 17 nuclear plants or nine Hoover Dams. The electricity load alone could power more than 13 million U.S. homes. Each site costs about $50 billion, and in total the buildout is almost half of the global $2 trillion AI infrastructure push forecast by HSBC.

Sam explained that the scale is simply a response to a huge spike in demand. Over the last 18 months, ChatGPT usage has jumped 10x. To handle it, Sam said OpenAI needs an entire network of supercomputing sites. “This is what it takes to deliver AI,” he said. “Unlike earlier versions of the internet, this requires massive infrastructure. And this is only a fraction of it.”

Partners lock in funding, power, and leadership to meet AI demand

The biggest issue isn’t money or chips, according to Sam, but power. “Electricity is the constraint,” he said. He led a $500 million funding round for Helion Energy, a fusion firm building a test reactor, and also helped take fission startup Oklo public through his own SPAC.

Not everyone’s convinced. Critics say the whole setup smells like a bubble. Companies tied to OpenAI, like Nvidia, Oracle, Microsoft, and Broadcom, have seen trillions in value added. Nvidia and Microsoft alone are now worth…

Author: BitcoinEthereumNews
Sam Altman confirmed OpenAI's $850 billion expansion is real and driven by massive AI demand

Sam Altman says people are right to be worried about the size of OpenAI’s new expansion, but he isn’t backing down from it. Speaking from a construction site in Abilene, Texas, where OpenAI is building its first mega data center, the CEO told reporters on Tuesday that the $850 billion infrastructure rollout is necessary. “People […]

Author: Cryptopolitan
The three major U.S. stock indexes all fell, with Nvidia down nearly 3% and Oracle down 4.3%.

PANews reported on September 24th that U.S. stocks closed lower on Tuesday, with the Dow Jones Industrial Average down 0.19%, the S&P 500 down 0.55%, and the Nasdaq down 0.95%. Nvidia (NVDA.O) fell nearly 3%, Oracle (ORCL.N) fell 4.3%, and TSMC (TSM.N) rose 3.7%.

Author: PANews
OpenAI reveals Stargate AI facility in Texas, projects planned in five more states

OpenAI revealed plans Tuesday to build six large computer facilities across the country, adding to its current Texas location as part of a massive $500 billion spending plan that President Donald Trump highlighted earlier this year. The company behind ChatGPT is working with Oracle and SoftBank through a partnership called Stargate to construct these new sites. Two more will go up in Texas, with additional locations planned for New Mexico, Ohio, and another Midwest state that hasn’t been named yet.

The biggest project sits in Abilene, Texas, where city leaders say the development is changing their old railroad community. Oracle officials who toured the eight-building site said it’s already set to become the world’s largest computer cluster for artificial intelligence work once construction wraps up. The complex will house hundreds of thousands of specialized computer chips.

Sam Altman from OpenAI admitted that most people don’t think about what happens behind the scenes when they use ChatGPT. He and Clay Magouyrk, Oracle’s new co-chief, talked about their efforts to limit environmental damage in this dry part of West Texas, where temperatures reached 97 degrees Tuesday. “We’re burning gas to run this data center,” Altman said, though he noted that Stargate hopes to use different power sources as the project grows.

The Texas facility needs roughly 900 megawatts of electricity to run all eight buildings and their computer chips. One building is already working, and a second that the executives visited Tuesday is almost done. Each equipment rack holds 72 of Nvidia’s GB200 chips, which handle the most demanding artificial intelligence tasks. Each building should contain about 60,000 of these chips.

Residents have mixed reactions to the OpenAI project

Not all residents are pleased because of the facility’s water and power requirements. The city’s water reservoirs were about half-full this week. People…

Author: BitcoinEthereumNews
Robot Swarms Could Solve Blockchain’s Oracle Problem, Researchers Say

New study shows mobile robots can collectively verify real-world data for smart contracts while resisting attacks.

Author: Coinstats
OpenAI announces five new Stargate data center sites

PANews reported on September 24 that OpenAI announced its "Stargate" data center infrastructure project will add five new sites over the next three years, providing a total of 7 gigawatts of additional power capacity. OpenAI announced the news at its flagship site in Abilene, Texas, alongside partners Oracle (ORCL.N) and SoftBank. The new sites will be located in Lordstown, Ohio; Shackelford and Milam counties, Texas; Doña Ana County, New Mexico; and an undisclosed location in the Midwest. OpenAI said in a blog post that it is evaluating more sites, that the new locations allow it to complete its original plan ahead of schedule, and that it expects to lock in its full $500 billion, 10-gigawatt investment commitment by the end of 2025.

Author: PANews
OpenAI’s Monumental Stargate Expansion: Fueling the Future of AI

The world of artificial intelligence is experiencing an unprecedented surge, with innovations emerging at a breathtaking pace. For those tracking the digital frontier, including the dynamic cryptocurrency space, the foundational infrastructure supporting these advancements is just as crucial as the breakthroughs themselves. Imagine the sheer computational muscle required to power the next generation of AI models, the kind that could redefine industries and even human interaction. This is precisely what OpenAI is gearing up for, embarking on an ambitious journey to construct a network of colossal AI data centers known as Project Stargate. This massive undertaking, backed by industry giants Oracle and SoftBank, signals a new era in AI development, promising to reshape the technological landscape as we know it.

What is Project Stargate and Why Does it Matter?

Project Stargate is not just another data center initiative; it’s a vision for a supercomputing backbone designed to meet the insatiable demands of advanced AI. At its core, Stargate represents OpenAI’s commitment to pushing the boundaries of artificial general intelligence (AGI). The sheer scale of this project is astounding, with a planned capacity of seven gigawatts. To put that into perspective, seven gigawatts is enough energy to power more than five million homes. This monumental energy requirement underscores the immense computational needs of training and deploying increasingly complex AI models, from large language models to advanced generative AI.

For anyone invested in the future of technology, understanding the infrastructure behind AI breakthroughs is key. Stargate isn’t just about housing servers; it’s about creating an ecosystem where AI can truly flourish, enabling faster training times, more sophisticated algorithms, and ultimately, more powerful and versatile AI applications.
The implications of such a project are far-reaching:

Accelerated AI Development: With vastly increased computational power, OpenAI can iterate on models more rapidly, leading to quicker advancements.
New Capabilities: More powerful infrastructure enables the development of AI models with capabilities previously deemed impossible.
Economic Impact: The construction and operation of these facilities will create jobs, stimulate local economies, and attract further investment in tech hubs.
Global Competitiveness: Such infrastructure solidifies the position of the United States as a leader in AI innovation.

The Power Players: Oracle, SoftBank, and OpenAI’s Strategic Alliance

The success of an undertaking as grand as Project Stargate hinges on strategic partnerships. OpenAI has meticulously chosen its collaborators, tapping into the strengths of two industry titans: Oracle and SoftBank. These alliances are not merely financial; they represent a confluence of technological expertise, global reach, and a shared vision for the future of AI.

Oracle’s Cloud Computing Prowess

Oracle, a long-standing giant in enterprise software and cloud infrastructure, brings its robust cloud computing capabilities and expertise in building and managing large-scale data centers. Three of the five new Stargate sites are being developed in collaboration with Oracle, leveraging their Oracle Cloud Infrastructure (OCI) platform. OCI is known for its high performance, scalability, and security, making it an ideal foundation for demanding AI workloads. Oracle’s commitment to supporting cutting-edge AI initiatives aligns perfectly with OpenAI’s goals, providing the necessary hardware, network, and operational excellence to handle the unprecedented data flows and processing power required. This partnership signifies Oracle’s aggressive push into the AI infrastructure market, positioning them as a critical enabler for the next wave of AI innovation.
SoftBank’s Visionary Investment and Global Reach

SoftBank, the Japanese multinational conglomerate known for its extensive investments in technology and telecommunications, is partnering with OpenAI on the remaining two Stargate sites. SoftBank’s involvement is strategic, providing not only significant capital but also its vast network and experience in large-scale infrastructure projects. SoftBank’s vision often involves identifying and nurturing disruptive technologies, and its investment in Stargate underscores the firm’s belief in the transformative power of AI. Their global perspective and ability to mobilize resources across diverse sectors will be invaluable in navigating the complexities of such a massive buildout. This collaboration highlights a shared understanding that foundational infrastructure is paramount for unlocking AI’s full potential.

The synergy between these three entities is undeniable:

OpenAI: The AI innovator, driving the demand for advanced compute.
Oracle: The infrastructure provider, offering scalable and high-performance cloud solutions.
SoftBank: The strategic investor and enabler, providing capital and global project expertise.

Unpacking the Scale: The Immense Need for AI Data Centers

The rapid evolution of AI models, particularly large language models (LLMs) and generative AI, has created an insatiable demand for computational resources. Training these models requires vast amounts of data and billions of parameters, necessitating an unprecedented scale of hardware and energy. This is where the concept of ‘gigawatt-scale’ AI data centers becomes critical. The existing infrastructure, while powerful, simply isn’t sufficient to handle the future needs of AI development.

Consider these factors driving the demand:

Model Complexity: AI models are growing exponentially in size and complexity, requiring more processing power and memory.
Data Volume: Training data sets are enormous, demanding massive storage and high-speed data transfer capabilities.
Inference at Scale: Once trained, these models need to serve millions of users, requiring efficient and powerful inference capabilities.
Continuous Innovation: The pace of AI research means constant experimentation and retraining, which consumes vast resources.

The $100 billion investment from Nvidia, mentioned in the original report, further illustrates this point. This capital is earmarked specifically for acquiring Nvidia’s cutting-edge AI processors, the very chips that power these advanced models, and building out even more AI data centers. It’s a clear signal that the race for AI dominance is fundamentally a race for computational infrastructure. Companies like OpenAI understand that without robust, scalable, and energy-efficient data centers, the theoretical advancements in AI cannot translate into practical, impactful applications.

Geographic Footprint: Where are OpenAI’s New Hubs Located?

The strategic placement of these new AI data centers is crucial for operational efficiency, energy access, and disaster recovery. OpenAI’s expansion includes five new sites across the United States, carefully selected for their potential to support such large-scale operations. Here’s a breakdown of the announced locations:

Shackelford County, Texas (Oracle): Leveraging Texas’s energy resources and growing tech infrastructure.
Doña Ana County, New Mexico (Oracle): Strategic location in the Southwest, potentially benefiting from renewable energy initiatives.
Undisclosed Midwest location (Oracle): Likely chosen for access to power grids and potentially cooler climates for efficient cooling.
Lordstown, Ohio (SoftBank): Utilizing existing industrial sites and energy infrastructure in the region.
Milam County, Texas (SoftBank): Another significant presence in Texas, highlighting its strategic importance for data center development.
These locations are not chosen at random. Factors such as access to reliable and affordable power, fiber optic networks, skilled labor, and favorable regulatory environments play a significant role. Texas, for instance, has become a hotbed for data center development due to its energy grid and pro-business environment. The expansion into the Midwest and Southwest also suggests a strategy of geographical diversification, reducing risks associated with localized natural disasters or energy disruptions. This distributed approach ensures redundancy and resilience, critical for maintaining continuous AI operations.

Fueling the Future: The Economic and Technological Impact of Stargate

The construction of these new Stargate data centers is more than just an infrastructure project; it’s an investment in the future of technology itself. The economic impact will be substantial, from the creation of thousands of construction and operational jobs to stimulating local economies in the chosen regions. Beyond direct economic benefits, the technological implications are profound. For the broader tech ecosystem, including areas like blockchain and decentralized finance (DeFi), the advancements enabled by Stargate could lead to:

More Sophisticated AI Tools: Enhanced AI capabilities could improve fraud detection, predictive analytics, and algorithmic trading in financial markets.
Improved Security: AI can play a critical role in cybersecurity, protecting digital assets and infrastructure.
Innovation in Web3: Advanced AI could power more intelligent smart contracts, decentralized applications (dApps), and user experiences in the Web3 space.
Energy Considerations: The immense power consumption of these centers also brings a focus on sustainable energy solutions, a conversation relevant to all high-energy-consumption industries, including cryptocurrency mining.

The challenges are also significant.
The sheer energy demand of seven gigawatts raises environmental concerns, pushing for innovations in renewable energy sources and energy efficiency. Cooling these massive facilities efficiently will be another engineering feat. However, the benefits of accelerating AI development are seen as outweighing these challenges, driving further investment in green tech and sustainable practices within the data center industry.

Conclusion: A New Dawn for AI Infrastructure

OpenAI’s Stargate project, backed by the formidable resources of Oracle and SoftBank, marks a pivotal moment in the history of artificial intelligence. By committing to build five new gigawatt-scale AI data centers, OpenAI is not just expanding its capacity; it is laying the groundwork for the next generation of AI innovation. This monumental undertaking will provide the computational horsepower necessary to train and deploy models that could redefine industries, from healthcare and finance to education and entertainment. While challenges like energy consumption and environmental impact remain, the strategic partnerships and massive investment underscore a collective belief in the transformative power of AI. As these Stargate hubs come online, they will not only solidify OpenAI’s leadership but also accelerate the entire AI ecosystem, promising a future where intelligent machines play an even more integrated and sophisticated role in our lives.

This post OpenAI’s Monumental Stargate Expansion: Fueling the Future of AI first appeared on BitcoinWorld.

Author: Coinstats