The British have essentially had a complete mental breakdown because of this! (they couldn’t handle the power of the Sun).
No offence Israel, but I’m coming to Tel Aviv… to Jerusalem, and I’m going to get Biblical on your arses! 😀
“For the first time in human history, the power of the Sun has been brought down to the surface of planet Earth”
“The most important discovery since fire”
I have NEVER been so serious about anything in my life… Israel, take all those Quantum Physics books and (don’t burn them) but put them to one side for a moment… take physics back one hundred years and start afresh… read Randell Mills’ work… read The Grand Unified Theory of Classical Physics… study it in depth, from its inception, its history, its continual evolution… the background to hydrino energy… the implications it has in EVERY discipline of science (and philosophy).
This discovery is the greatest in recent human history… anyone that grasps it, understands it, and masters it can propel technology and science forward in ways we cannot yet imagine.
EVERYTHING from energy, physics, cosmology, chemistry, genetics, molecular biology, drug development, ‘anti-gravity’, synthetic materials, electronics… there is NOTHING this is not going to affect… aviation, space exploration, mineral extraction, medicine…
I’ve exhausted myself… I don’t know what else to say. Israel… apply your best and brightest to this, whilst everyone else is pursuing ‘Quantum’… you will emerge as the technological powerhouse of the next hundred years… a technological superpower… unrivalled and untouchable. (sorry to those that disagree but there are military applications, I imagine Israel is already thinking the same!)
It’s the greatest gift you have since… … the Ark of the Covenant! 😀 (you can build yourself a new one! A 21st century Ark of the Covenant!)
Other inventions/patents from Dr Randell Mills ranging from genetic sequencing to selective drug delivery to Resonant Magnetic Susceptibility Imaging (ReMSI)… the guy’s a genius!… https://www.google.com/search?q=randell+mills …
“When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.”
£1302.00 … for a one day conference! No wonder these people rule the world!
Leadership in a climate of disruptive change
18 March 2019 – 9:30am to 5:30pm
Chatham House, London
Energy Transitions 2019
New Actors, New Technologies, New Business Models
18 March 2019 – 9:30am to 5:30pm
Chatham House, London
A global shift in the energy sector is under way with the rise of renewable energy sources spearheaded by their dominance of investment in the power sector. This is leading to disruptive change as the greater deployment of renewables and many associated technologies, such as storage, are challenging existing business models and threatening the market dominance of the existing actors. At the same time investment in fossil fuels has stabilized, as a slowdown of the financing of coal has been balanced by modest increases in spending in upstream oil and gas.
New global trends, electrification of new sectors such as transport and heating, along with the provision of modern energy services to over a billion people lacking access could further disrupt the energy sector, and the future impacts of these transitions on global energy security and sustainable transitions globally remain unclear.
Therefore, now, more than ever, it is critical that policy-makers and business leaders re-evaluate current and future strategies for delivering the domestic and international energy transition. The fourth annual Chatham House Energy Transitions conference will examine the new drivers of change, focusing on how different economies and industries can make the shift to a low-carbon energy future. Key questions to be explored include:
What will incentivize an acceleration in decarbonization and drive low-carbon innovation?
How can new technologies be deployed to transform grid interaction and enhance connectivity?
What are the implications of the changing policy environment for low-carbon investment?
How do disruptive shifts in the energy sector affect the prospects for enhancing access to clean, safe and sustainable energy in developing countries?
The Chatham House Rule
To enable as open a debate as possible, this conference will be held under the Chatham House Rule.
Achieving the Paris Agreement’s goal of limiting temperature increases to ‘well below’ 2°C requires environmental leadership to rapidly emerge within the world’s centres of economic policymaking: treasuries, finance ministries and ministries of economy and business.
Plenary Session at the Waddesdon Club 2018 annual meeting
The urgency of climate change dictates that the next generation of leaders must deliver the economic transformation needed; these individuals need to understand how climate and environment challenges will affect their time in power and define their legacies.
The Waddesdon Club is Chatham House’s response – through engaging future leaders, it seeks to equip them with the necessary tools, concepts, language, and capacities for influence needed to advance a mainstream economic agenda for climate change and sustainable development. Core to this approach is an annual retreat at Waddesdon Manor, offering a unique opportunity for participants to deepen their knowledge; widen their peer network, including meeting leading international experts; and share their respective perceptions, experiences and ideas on climate change issues.
Previous Waddesdon Club Retreats
The inaugural Waddesdon Club retreat was held in October 2016 with a broad focus on the importance of low-carbon industrial strategies in mobilizing capital for low-carbon investment, driving down technology costs, fostering innovation and phasing out high-emitting activities.
The second Waddesdon Club retreat took place in early 2018, with a discussion on the practical policy challenges of managing the green economy transition. Expert speakers highlighted the role of international institutions in shaping norms, policies and financial flows. Participants addressed the need for a vision that brings together poverty alleviation, tackling inequality and addressing climate change for a just transition amidst rapid decarbonisation.
Achieving the Paris Agreement’s goal of limiting temperature increases requires strong environmental leadership within economic policymaking. In October 2016, Chatham House used the unique setting of Windmill Hill to convene future leaders in finance and economy ministries from across the globe. The self-styled ‘Waddesdon Club’ aimed to enhance understanding of climate and environmental challenges and ensure their consideration within policymaking at the highest level.
Organised by the Energy, Environment and Resources and International Economics departments at Chatham House, the ‘retreat’ was attended by leading economic policymakers and experts from the fields of climate science, energy and finance. Utilising the inspiration of Windmill Hill, itself a celebration of the conservation and environmental work pursued by the Rothschild Foundation, attendees shared knowledge on the current political and economic context and explored the intersections of environmental and economic policymaking. With the long-term aim of supporting economies to respond more effectively to global change, the event identified recommendations for future discussion.
In order to enhance attendees’ experience and support effective communication, a dedicated mobile app was developed for the event. Co-created by Chatham House and digital tool provider, Lumi, the app allowed real-time updates and feedback as well as being an on-going resource which supports the implementation of ideas discussed at the event.
Chatham House plan to continue the momentum built through the first Waddesdon Club with future events at Windmill Hill.
“Quantum Physics is a fairy tale! Like a pretty, perfect looking magical castle in the distance… but when you finally get up close and inside… it’s a dark deceptive trap that you can never escape from!… … And it’s about to collapse!”
From David Harriman’s course “The Philosophic Corruption of Physics,”…
Below is a series of lectures, David Harriman’s series, “The Philosophic Corruption of Physics/Reality.” Herein, he walks us through the history of physics and how the Kantian philosophy subverted the science.
“I love the way this guy pronounces ‘Kant’… it’s like the posh English way of pronouncing…
“Okey dokey Danny Boy!”
“Quite fitting though really!” 😀
“Anyone that knows me personally, knows I am literally like a little boy when it comes to this one! … X-Men comics! Anyone? No? GUESS WHO MY FAVOURITE X-MAN WAS!” 😀
“And still is?”
“GAMBIT! Well… I’m more inclined towards Magneto’s Brotherhood these days”
In the 21st century, physicists, mathematicians and theoreticians have turned to supercomputers in their quest for the ‘Unified Theory’ of physics. Supercomputers have been used in every field from cosmology, astrophysics and particle physics to the search for the elusive ‘dark matter’ (which Mills has identified as ‘hydrino’). Thus far each effort (as far as I’m aware) has been based on the ‘Standard Model’ of physics, and as yet has produced no concrete results, new theories or confirmation of existing ones (i.e. the Standard Model).
Why not use a supercomputer to test Mills’ GUT-CP model of atomic structure, and of the Universe? Mills has almost single-handedly built an entirely new model of physics, of the atom and electron… and it has thus far proven to be far more accurate than the Standard Model and Quantum Mechanics. Millsian software (as discussed previously) is a perfect example of the accuracy of his model in regards to molecular structure and calculating bond energy transfers. GUT-CP has predicted everything from the accelerated expansion of the Universe to recently confirmed ‘gravitational’ waves.
One such effort in the search for ‘dark matter’, ‘new physics’ and the fundamentals of particle physics & cosmology is the GAMBIT Collaboration project. GAMBIT is The Global And Modular BSM Inference Tool, and is made up of a collection of researchers from scientific institutions worldwide, using the supercomputer Prometheus (amongst others) in the search for dark matter and a unified theory.
GAMBIT narrows the hiding places for ‘new physics’
The Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences:
“Is it possible for today’s apparatus to detect the elementary particles of ‘new physics’ that are capable of explaining such mysteries as the nature of dark matter or the lack of symmetry between matter and antimatter? To answer this question, scientists from the international GAMBIT (Global and Modular Beyond-the-Standard-Model Inference Tool) Collaboration have developed a set of software tools that comprehensively analyse data collected during the most sophisticated contemporary experiments and measurements.”
“Although almost a century has passed since Zwicky’s discovery, it has not been possible to investigate the composition of dark matter to this day, nor even to unambiguously confirm its existence. Over this time, theoreticians have constructed many extensions of the Standard Model containing particles that are to a greater or lesser extent exotic.”
The following article from Physics World explains GAMBIT, the history of the ‘discovery’ of ‘dark matter’ and the subsequent search for its identity:
When supercomputers go over to the dark side
“Despite oodles of data and plenty of theories, we still don’t know what dark matter is. Martin White and Pat Scott describe how a new software tool called GAMBIT – run on supercomputers such as Prometheus – will test how novel theories stack up when confronted with real data”
“Unexpected scientific paradigm shifts, where reality turns out not to be as we believed, can be just as exciting and perplexing. One such dramatic change in perspective has been the dawning realization over the last few decades that “ordinary” matter accounts for just a fifth of the matter in the universe, with the rest made of a mysterious “dark” matter. Physicists love unsolved problems, and they don’t come much bigger than working out the nature of this dark stuff.
If a blockbuster movie is ever made about the discovery of dark matter, the next decade may well be the climax. New data from experiments such as CERN’s Large Hadron Collider (LHC) are telling us more about what dark matter can and cannot be, while the recent discovery of gravitational waves reminds us that even century-old theories (general relativity in this case) can be spectacularly confirmed in the blink of an eye”
GAMBIT project suggests theoretical particles are too massive for LHC detection
“The idea of the GAMBIT Collaboration is to create tools for analyzing data from as many experiments as possible, from different areas of physics, and to compare them very closely with the predictions of new theories. Looking comprehensively, it is possible to narrow the search areas of new physics much faster, and over time also eliminate those models whose predictions have not been confirmed in measurements,” explains Dr. Marcin Chrzaszcz.
Verification of the new physics proposals takes place in the GAMBIT Collaboration as follows: scientists choose a theoretical model and build it into the software. The program then scans the values of the main model parameters. For each set of parameters, predictions are calculated and compared to the data from the experiments.
“In practice, nothing is trivial here. There are models where we have as many as 128 free parameters. Imagine scanning in a space of 128 dimensions—it’s something that kills every computer. Therefore, at the beginning, we limited ourselves to three versions of simpler supersymmetric models, known under the abbreviations CMSSM, NUHM1 and NUHM2. They have five, six and seven free parameters, respectively. But things nonetheless get complicated, because, for example, we only know some of the other parameters of the Standard Model with a certain accuracy. Therefore, they have to be treated like free parameters too, only changing to a lesser extent than the new physics parameters,” says Dr. Chrzaszcz.
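The scan-and-compare loop Dr. Chrzaszcz describes can be sketched in miniature. This is a toy illustration only, not GAMBIT’s actual code: the model, the observables, the parameter ranges and the chi-squared fit below are all invented for the example.

```python
import random

# Toy "measurements": observable -> (measured value, 1-sigma uncertainty).
# Both the observables and the numbers are invented for this sketch.
DATA = {"observable_a": (125.0, 0.5),
        "observable_b": (0.23, 0.02)}

def predict(params):
    """A made-up two-parameter model mapping parameters to observables."""
    m, g = params["mass"], params["coupling"]
    return {"observable_a": m,
            "observable_b": g * m / 550.0}

def chi_squared(params):
    """Goodness of fit: sum over observables of ((prediction - measurement) / sigma)^2."""
    pred = predict(params)
    return sum(((pred[k] - mu) / sigma) ** 2 for k, (mu, sigma) in DATA.items())

def random_scan(n_samples, seed=1):
    """Randomly sample the parameter space, keeping the best-fitting point."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_samples):
        params = {"mass": rng.uniform(100.0, 150.0),
                  "coupling": rng.uniform(0.5, 1.5)}
        chi2 = chi_squared(params)
        if best is None or chi2 < best[0]:
            best = (chi2, params)
    return best

if __name__ == "__main__":
    chi2, params = random_scan(20000)
    print(f"best chi^2 = {chi2:.3f} at {params}")
```

With only two free parameters a blind random scan works; the real collaboration’s models have five to 128 parameters and use far cleverer sampling algorithms, which is exactly why supercomputers like Prometheus are needed.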
The GAMBIT Project website gives the following explanation – “Welcome to the GAMBIT homepage. GAMBIT is a global fitting code for generic Beyond the Standard Model theories, designed to allow fast and easy definition of new models, observables, likelihoods, scanners and backend physics codes.”
Thus far, ALL previous efforts with GAMBIT and Prometheus have led to dead ends (like all of modern QM physics), but it is worth noting that every one of those efforts has been built upon, or has expanded upon, THE STANDARD MODEL!
What if this code, or at least these supercomputers, were used for Mills’ GUT-CP and hydrino model? Entering a theory based upon Classical Laws? Expanding upon Newton and Maxwell’s equations… essentially used to enter ALL of Mills’ 30 years of research, theories, predictions and equations… tested against known observables in the Universe?
I believe the results of doing so would be extraordinarily accurate compared to all previous attempts, and would be the beginning of creating a computer model/simulation of the Universe to a degree of accuracy never before witnessed in mathematics, particle physics, astrophysics and cosmology… essentially the beginning of creating a computer simulation of the history of the Universe, past, present and future… A COMPUTER SIMULATION OF THE HISTORY OF THE UNIVERSE!
(as well as confirming hydrino as dark matter, the expansion/contraction of the Universe, gravitational waves and causes etc.)
Surely if all these Quantum Physicists, astrophysicists and cosmologists are saying Mills is fundamentally wrong… take a risk and run it on a supercomputer such as Prometheus (fitting name considering!)… take a GAMBIT!
“The intro to the awesomest cartoon ever! I’m going to walk down the aisle to this theme!”
“The applications of unlimited, unbounded energy are only constrained by human imagination, ingenuity and ambition” – Dr Randell Mills
Brilliant Light Power Terraforming Application Video
(The music! 😀 … No dolphins? 😦 )
“We need to start terraforming the planet in a positive and beneficial way for the sake of the survival of our species and all life on Earth… because at the moment we’re terraforming Earth like General Zod!… Take California for example, severe drought and water shortages, catastrophic wildfires… … although when those fires reached Rupert Murdoch’s home, I was privately thinking ‘swings and roundabouts!’… … but we could create a global paradise of abundance if we utilised this technology correctly”
Hydrino energy is high density, cheap, non-polluting, safe, and deployable anywhere in the world…
Obviously Brilliant Light Power and hydrino energy have the potential to eradicate the burning of ALL fossil fuels globally, thereby cutting global carbon emissions to ZERO!
Hydrino technology can also replace ‘renewable’ energy sources such as solar and wind farms, which would potentially take up millions of square miles of the Earth’s landmass to host. There is no radioactive or chemical waste, or pollution of any kind.
Dr Mills and Brilliant Light Power also envision a world where ‘hydrino’ technologies will be used to transform the planet’s most inhospitable wastelands and arid regions into ‘lush, liveable, crop-producing expanses’ whilst ‘also preventing drought in already cultivated areas such as California.’
A recent article unveiled plans by scientists to transform areas of the Sahara into a lush rainfall region with abundant greenery. Although the plan utilises a combination of wind turbines and solar panels, it highlights the possible ways in which future clean technologies will be used to positively alter and transform the Earth’s landscape.
I believe much more research should be conducted, and careful consideration taken, into the idea of transforming certain regions of the Earth using hydrino power alone. As pointed out in the following article, transforming areas such as the Sahara may have unintended consequences for other regions such as the Amazon rainforest and Atlantic Ocean marine life…
8 Craziest Mega-Engineering Projects That Could Rework Earth
“For one, sand from the Sahara is carried into the air, across the Atlantic, and deposited in South America. The rich dust that falls from the sky, and the rain storms caused by that dust picking up moisture during its transoceanic journey, both fertilize the Amazon rain forest. No desert, no dust. No dust, no rain forest. During that journey, the dust also feeds a variety of sea life.”
Brett Holverstott mentions in his talk how hydrino energy has the potential to help alleviate, if not eradicate :-
Ice cap melting
The dying of marine life
Energy poverty (hydrino being deployable in the third world)
The cutting down of the Amazon rainforest
Global city smog
Construction of river dams
… amongst numerous other potential environmental benefits.
I also believe our new understanding of atomic structure and molecular physics (Millsian), could pave the way for new cleaner technologies and sustainable materials in industries other than energy. These may include :-
– Plastics and materials
– Food and crop growth
– Fertilisers and agriculture
– Mining and mineral extraction
– Methods for cleaning up past environmental damage (plastics, carbon emissions, nuclear and toxic waste), essentially reversing the damage already caused by the Industrial Revolution.
“The applications of unlimited, unbounded energy are only constrained by human imagination, ingenuity and ambition” – Dr Randell Mills
“Before we venture out into the stars, looking to terraform other planets… maybe we should take a look back at our own and try and reverse the damage we’ve done here” – Danny Hurley
Robert Senftleben: Terraforming Planet Earth
Large parts of the surface of our planet have been devastated by human activity. Terraforming on a human scale is needed to bring these landscapes back to life. The knowledge and technology is there, and you can learn how to use it and participate.
EXCELLENT TED TALK!
How to green the world’s deserts and reverse climate change | Allan Savory
“Desertification is a fancy word for land that is turning to desert,” begins Allan Savory in this quietly powerful talk. And terrifyingly, it’s happening to about two-thirds of the world’s grasslands, accelerating climate change and causing traditional grazing societies to descend into social chaos. Savory has devoted his life to stopping it. He now believes — and his work so far shows — that a surprising factor can protect grasslands and even reclaim degraded land that was once desert.
Update (13/09/2018) ;D Gaia 2.0 (Timothy M. Lenton, Bruno Latour – University of Exeter)
“A time-honoured theory into why conditions on Earth have remained stable enough for life to evolve over billions of years has been given a new, innovative twist.
For around half a century, the ‘Gaia’ hypothesis has provided a unique way of understanding how life has persisted on Earth. It champions the idea that living organisms and their inorganic surroundings evolved together as a single, self-regulating system that has kept the planet habitable for life—despite threats such as a brightening Sun, volcanoes and meteorite strikes.
However, Professor Tim Lenton from the University of Exeter and famed French sociologist of science Professor Bruno Latour are now arguing that humans have the potential to ‘upgrade’ this planetary operating system to create “Gaia 2.0”. They believe that the evolution of both humans and their technology could add a new level of “self-awareness” to Earth’s self-regulation, which is at the heart of the original Gaia theory.
As humans become more aware of the global consequences of their actions, including climate change, a new kind of deliberate self-regulation becomes possible where we limit our impacts on the planet. Professors Lenton and Latour suggest that this “conscience choice” to self-regulate introduces a “fundamental new state of Gaia”—which could help us achieve greater global sustainability in the future. However, such self-aware self-regulation relies on our ability to continually monitor and model the state of the planet and our effects upon it.
Professor Lenton, Director of Exeter’s new Global Systems Institute, said: “If we are to create a better world for the growing human population this century then we need to regulate our impacts on our life support-system, and deliberately create a more circular economy that relies—like the biosphere—on the recycling of materials powered by sustainable energy.”
The original Gaia Theory was developed in the late 1960s by James Lovelock, a British scientist and inventor.
It suggested that both the organic and inorganic components of Earth evolved together as one single, self-regulating system which can control global temperature and atmospheric composition to maintain its own habitability.
The new perspective article is published in the leading journal Science on September 14, 2018. It follows recent research, led by Professor Lenton, which offered a fresh solution to how the Gaia hypothesis works in real terms: stability comes from “sequential selection”, in which situations where life destabilises the environment tend to be short-lived and result in further change until a stable situation emerges, which then tends to persist. Once this happens, the system has more time to acquire further properties that help to stabilise and maintain it—a process known as “selection by survival alone”.
Creating transformative solutions to the global changes that humans are now causing is a key focus of the University of Exeter’s new Global Systems Institute.”
“I actually emailed Roger Penrose with this last year.”
“Did you get a response?”
“No… but then he is ‘Sir’ Roger Penrose, and what we know of the Queen’s Knighthood list is, it reads like a God damn sex offen..
“Drop it Danny Boy!”
Recent observations made by both the Planck observatory and the BICEP2 South Pole telescope indicate possible remnants of a previous Universe. This possible indication has been interpreted by a number of physicists in different ways, but Roger Penrose of Oxford University believes what is being seen in the Cosmic Microwave Background (CMB) data are radioactive swirls dubbed ‘Hawking points’, thus being proof of a previous Universe existing prior to this present one, in what he and his colleagues call the “conformal cyclic cosmology” (CCC).
(It is worth noting these conclusions are drawn mainly from the data from the Planck observatory, and raw data from the BICEP2 is still to be released)
Radioactive swirls in the cosmos may rewrite the origin story of the universe “The idea is called “conformal cyclic cosmology” (CCC), and what it asserts is that, rather than starting from a big bang, the universe continually expands and contracts, each time leaving behind tiny bits of electromagnetic radiation that remain as the process occurs over and over. The late Stephen Hawking predicted tiny dots of radiation, which others call ‘Hawking points’, left over from this cycle.”
These Swirls of Light Could Be Signs of a Previous Universe Existing Before Ours “Penrose’s CCC model was developed as an answer to a curious imbalance between measurements of our early Universe’s temperature and the state of order we might expect. According to him, this imbalance could be accounted for by the death of a pre-existing universe that was there before the Big Bang. Oscillating universes come in a few different forms, depending on your choice of model. Some suggest the Universe is destined to fall back in itself one day.”
This observation, and its interpretation, caught my attention because, according to Mills’ GUT-CP, we live in an ‘Oscillating Universe’, eternally expanding and contracting over a period of a trillion years or so (give or take). According to Mills, the Universe is in a continual cycle of expansion from a radius of 9 billion light years to 312 billion light years, followed by a contraction phase back (each phase lasting roughly 450 billion years). This happens because, during the expansion phase (or ‘annihilation’), matter is converted directly into energy (through various ‘hydrino’, chemical and nuclear processes throughout the Universe, including in our own Sun), which causes spacetime to expand everywhere throughout the Universe.
NOTE – Dr Randell Mills successfully predicted the acceleration of the expanding Universe in his model prior to its discovery. He was also the first to successfully predict gravitational waves within his model.
After the Universe has expanded to its peak radius (312 billion light years), the engines of the expansion phase have ‘run out’ so to speak (stars, supernovas, neutron stars etc.), and most of the Universe’s matter has been converted into energy… the Universe will then begin its contraction phase and the process is reversed. Radiation will create particles, which in turn create atoms, converting ‘dark energy’ into matter, thus contracting spacetime everywhere throughout the Universe.
“The conversion of matter into energy causes spacetime, and thus the universe, to expand, since light has inertial but no gravitational mass. The acceleration of the expansion of the presently observed universe was predicted by Mills in 1995 and has since been confirmed experimentally. Mills predicts that the universe expands and contracts over thousand-billion year cycles.” – Brilliant Light Power
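Taken at face value, the figures quoted above (a radius cycling between 9 and 312 billion light years over a roughly trillion-year period) can be sketched as a simple oscillation. This is purely an illustrative toy, not Mills’ actual equations: the sinusoidal form, the 900-billion-year period and the starting phase are all my own assumptions for the sake of the sketch.

```python
import math

# Figures quoted in the text above (assumed): radius oscillates between
# 9 and 312 billion light years; a full expansion + contraction cycle is
# taken here as ~900 billion years (roughly 450 billion years each way).
R_MIN = 9e9       # light years
R_MAX = 312e9     # light years
PERIOD = 9e11     # years for one full cycle

def universe_radius(t_years):
    """Toy sinusoid between the quoted minimum and maximum radii,
    starting at the minimum radius at t = 0."""
    mid = (R_MAX + R_MIN) / 2
    amp = (R_MAX - R_MIN) / 2
    return mid - amp * math.cos(2 * math.pi * t_years / PERIOD)

if __name__ == "__main__":
    for t in (0.0, PERIOD / 4, PERIOD / 2, 3 * PERIOD / 4):
        print(f"t = {t:.2e} yr: radius = {universe_radius(t):.3e} ly")
```

At t = 0 the toy gives the minimum 9-billion-light-year radius and at half a cycle the 312-billion-light-year maximum; Mills’ model would derive the actual time dependence from his own equations rather than a bare cosine.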
It is important to note that The Big Bang Theory is just that… a theory, and no direct evidence for it has ever been put forward. The idea was based upon the fact that the Universe was expanding (prior to knowledge of its acceleration, which should in turn discount the theory). Many physicists and cosmologists throughout the 20th century questioned the theory, and by the early 21st century a vast array of evidence has slowly accumulated to discount it.
– Riess 1998. Hubble data showed that the Universe was NOT decelerating as predicted by the Big Bang Theory, but actually accelerating. Unknown to most in science, Mills had successfully predicted this two years prior in his GUT-CP model. (This is when the idea of ‘dark energy’ came to the forefront of physics, in order to account for this surprising observation.)
– Space Circles Are Proof of a Pre-Big Bang Universe? (2010) Recycled-universe theory “works on paper,” but details missing, critics say.
– 3 Theories That Might Blow Up the Big Bang (Steinhardt and Turok) “Steinhardt and Turok—working closely with a few like-minded colleagues—have now developed these insights into a thorough alternative to the prevailing, Genesis-like view of cosmology. According to the Big Bang theory, the whole universe emerged during a single moment some 13.7 billion years ago. In the competing theory, our universe generates and regenerates itself in an endless cycle of creation. The latest version of the cyclic model even matches key pieces of observational evidence supporting the older view.”
“We weren’t looking for cycles,” Steinhardt says, “but the model naturally produces them.” After a collision, energy gives rise to matter in the brane worlds. The matter then evolves into the kind of universe we know: galaxies, stars, planets, the works. Space within the branes expands, and at first the distance between the branes (in the bulk) grows too. When the brane worlds expand so much that their space is nearly empty, however, attractive forces between the branes draw the world-sheets together again. A new collision occurs, and a new cycle of creation begins. In this model, each round of existence—each cycle from one collision to the next—stretches about a trillion years. By that reckoning, our universe is still in its infancy, being only 0.1 percent of the way through the current cycle. The cyclic universe directly solves the problem of before. With an infinity of Big Bangs, time stretches into forever in both directions. “The Big Bang was not the beginning of space and time,” Steinhardt says. “There was a before, and before matters because it leaves an imprint on what happens in the next cycle.”
– As Holverstott states in ‘Randell Mills and the Search for Hydrino Energy’, numerous ‘ancient’ structures are being discovered throughout the Cosmos that seem to predate the accepted 13.6-billion-year ‘beginning’.
~Including a ‘quasar that is 13 billion light years away, yet powered by a black hole about 2 billion times the mass of the Sun’ (Mortlock 2011).
~A star smaller than our own Sun, which has almost no trace of elements heavier than hydrogen or helium, with a ratio of helium lower than that theoretically created in the big bang. Dubbed ‘The Star That Should Not Exist’ (which is so sweet! :D)
– More recently, a gargantuan black hole found in 2013 again throws doubt upon the notion that nothing existed prior to our ‘known’ Universe. I believe many more objects and structures will be found in the coming years and decades that will support Mills’ GUT-CP and his ‘Oscillating Universe’ model.
– Young black hole had monstrous growth spurt
Super-massive object found in early Universe tests theories of cosmic evolution.
“A black hole that grew to gargantuan size in the Universe’s first billion years is by far the largest yet spotted from such an early date, researchers have announced. The object, discovered by astronomers in 2013, is 12 billion times as massive as the Sun, and six times greater than its largest-known contemporaries. Its existence poses a challenge for theories of the evolution of black holes, stars and galaxies, astronomers say.”
Mills’ model of an Oscillating Universe is NOT to be confused with other models such as the Big Bounce or CCC, which are still based on quantum models such as string or M-theory… this is a truly original, more elegant and simpler model, arrived at through a classical understanding of atomic structure and gravitational forces (i.e. the hydrino model).
For further details see… Summary Of Randell Mills’s Unified Theory (Holverstott)
A potentially paradigm-shifting technology has been under development at an R&D firm in NJ called Brilliant Light Power. For people monitoring the situation, the question currently is about the status of commercialization. It is not a publicly held firm, but is in mid-stages of private equity capitalization in the range of $100-120M.
I recently read a book titled “Randell Mills and the Search for Hydrino Energy”, offering a detailed and compelling history of the development of this novel renewable energy technology, authored by an insider, an intern who stayed on to work there for several years (published in 2016, with company data as of end of 2015). To provide some context, this article will summarize the concept and breakthrough achievements, compare the technology’s levelized costs to those of other generation technologies, offer a brief review of validation efforts, and touch on personnel and capitalization. I will try to be faithful to the information presented in the book and website materials, and will try to identify my own cautious opinions in context.
The technology was developed by Randell Mills, whose special talents manifested while he was still a graduate student in physics at Harvard, when he made a discovery in 1989 while exploring a foundational question in physics: why does an orbiting electron not radiate away its energy? Quantum mechanics diverged from classical mechanics without ever answering this question. Mills emerged with a revised classical theory that included the proposition that hydrogen’s ground state can in fact be lower than previously thought: that it can have fractional ground states.
According to Mills’ theory, hydrogen can react with a catalyst in a 2-step process. First, a small amount of energy is transferred to the catalyst by a process called resonant inductive coupling, in integer increments of 27.2 eV. Once this energy is accepted by the catalyst from the atomic hydrogen, the hydrogen electron becomes unstable and decays into a lower, fractional orbital, closer to the nucleus. This 2nd step releases a far larger increment of energy than any known chemical reaction would predict, roughly 200x that of burning hydrogen. The resulting species of hydrogen was dubbed a “hydrino”.
Mills documents extensive experimental confirmation, which to date has identified hydrino states of 1/2, 1/3, 1/4, down to 1/10 (fractions of the normal ground-state orbital radius). The theory calculates possible hydrinos down to a theoretical limit of 1/137, constrained by the electron reaching relativistic speeds.
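To make the claimed energy scale concrete, here is a minimal sketch, assuming (as stated in Mills' papers) that the H(1/p) hydrino state has a binding energy of p² × 13.6 eV, and taking roughly 1.25 eV per hydrogen atom for ordinary combustion; the numbers and the comparison factor are illustrative, not figures from the article:

```python
# Hedged sketch: hydrino transition energies, under the assumption (from
# Mills' theory) that the H(1/p) state has binding energy p^2 * 13.6 eV.

RYDBERG_EV = 13.6        # binding energy of ordinary ground-state hydrogen, eV
H_COMBUSTION_EV = 1.25   # approx. energy per H atom from burning hydrogen, eV

def hydrino_binding_energy(p: int) -> float:
    """Binding energy of the H(1/p) state in eV (p = 1 is ordinary hydrogen)."""
    return RYDBERG_EV * p ** 2

def transition_energy(p: int) -> float:
    """Total energy released going from ordinary H (p = 1) to H(1/p), in eV."""
    return hydrino_binding_energy(p) - hydrino_binding_energy(1)

# States the article cites experimentally (1/2 .. 1/10) plus the 1/137 limit
for p in (2, 3, 4, 10, 137):
    e = transition_energy(p)
    print(f"H -> H(1/{p}): {e:10.1f} eV  (~{e / H_COMBUSTION_EV:,.0f}x H combustion)")
```

Under this scaling, the H → H(1/4) transition releases about 204 eV, which is the right order of magnitude for the "200x burning hydrogen" claim quoted above.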
The book documents Mills’ 25-year journey of verification, both with collaborators and validation by a growing number of independent investigators who report finding confirmation in a wide range of experimental configurations. Mills was prolific in publishing his findings, in progressively more prestigious journals, in the face of persistent resistance from establishment figures. The reference section of the book documents 96 journal articles with Mills as primary author (as of late 2015, now over 100), 52 journal articles with non-BLP primary authors, and 31 other technical reports regarding hydrino research by various universities, national labs and corporations.
Early lab setups involved low-temperature electrolytic cells, but Mills eventually found that the phenomenon could be triggered and measured more successfully under high-temperature plasma conditions. Subsequently, tests were constructed using various types of instrumentation, & validated by leading experimentalists in this field, experts in thermal measurement. A summary of the full extent of the verification data discussed in the book and website materials is beyond the scope of this article, but it’s worth including one slide which lists 29 types of confirming evidence compiled to date, including 7 or 8 types of spectroscopy and 4 or 5 types of calorimetry.
On the website, most of the Validation Reports are compiled under the Technology tab and also under the News/What’s New tab. The focus seems to be weighted most heavily toward confirming that the energy is generated by the hydrino reaction process. Business presentation pdf’s, PowerPoints & videos of conferences are in the News/Archive tab. The validation page reports 4 independent studies in 2020, 3 in 2019, 5 in 2016, and an additional 17 earlier reports.
The specific excess heat generated is not documented uniformly within a single reference system. As I searched to compile these results, I found various expressions of “gain” cited in PowerPoint slides reporting outcomes from a range of experiments, as follows:
“energy gain of 200-500x”
“Optical energy output of 30x input”
a table identifying specific experiments with a gain column, in which the 3 highest values are 399x, 279x & 213x
“peak power 20MW, time-avg power 4.6MW, optical emission energy 250x applied energy”
“input power 6.68 kW, output 1,260 kW” 1260/6.68 = 188x
in terms of power density, as “20MW in microliters”, and elsewhere “billions of watts per liter”
the 2020 validation studies report finding that hydrino plasma produced excess power of 275kW, 340kW, 200kW & 300kW respectively.
2019 reports cite power levels of 1,000kW & 100kW.
2016 studies report 514kW of optical power & 1.3MW peak power; 689kW with 28x gain; thermal power levels of 440kW; & 1.5MW continuous power from 8.6kW input (1500/8.6 = 174x)
It would benefit the company to clarify and reconcile these values, especially where they differ by orders of magnitude, i.e., in ranges of 10x vs 100x. This would help make clear the relationship between these output values and the resulting dramatic reductions in cost of energy production per kW, which are discussed further below.
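Two of the figures quoted above pair an input and an output directly, so their implied gains can at least be recomputed from the numbers as published; a quick sketch using only those quoted pairs:

```python
# Recompute implied gain ratios from the input/output pairs quoted above.
# Only the two runs that publish both numbers are included.
quoted_runs = {
    "6.68 kW in / 1,260 kW out": (6.68, 1260),
    "8.6 kW in / 1,500 kW out (2016 continuous run)": (8.6, 1500),
}

for label, (kw_in, kw_out) in quoted_runs.items():
    gain = kw_out / kw_in
    print(f"{label}: gain ≈ {gain:.1f}x")
```

Both pairs land in the ~100x range, an order of magnitude below the 200-500x figures quoted elsewhere, which is exactly the kind of discrepancy the company would do well to reconcile.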
The experimental configurations evolved from demonstrating the effect in single-shot events to systems that could sustain continuous reactions and maintain a stable plasma. In these early events, a target material was bombarded by a catalyst along with a high-current, low-voltage electrical discharge to create the plasma conditions, resulting in an excess of energy so hot that even electrodes made of tungsten were vaporized.
The next steps involved engineering design to develop a commercial prototype, and optimization of supporting systems. The most challenging practical problem was designing an electrode that could withstand the high temperatures. This was solved by making the electrode entirely liquid: an arcing molten metallic silver electrode with a continuous feed, into which the catalyst was mixed, enabling a continuous plasma reaction. The reaction takes place in a small containment vessel with two feeder systems, one for the liquid silver, the other for moving the atomic oxygen and hydrogen in and hydrinos out. The plasma is maintained at 4,000°C and generates very high energy photonic radiation in the Extreme Ultra Violet frequency range (EUV), producing excess heat and molecular signatures confirming hydrino profiles. Supporting systems were engineered for electrolysis of the water, for induction pumping of the silver, for heat transfer, and for electrical offtake.
The system was branded the “SunCell”.
The reaction produces no emissions other than the reduced hydrinos, which are 64x smaller in volume than ordinary hydrogen. The current design captures the hydrino gas in a charcoal trap or a milled halide hydroxide crystalline matrix to which the hydrinos can bind. If exhausted into the air, the gas is inert, non-toxic, and lighter than helium, and would rise to the upper atmosphere.
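The "64x smaller" figure is consistent with the H(1/4) state: if, as the hydrino model described above implies, the orbital radius of H(1/p) is 1/p of the ordinary hydrogen radius, then volume shrinks as the cube of p. A one-line check:

```python
# Volume reduction of a hydrino H(1/p) relative to ordinary hydrogen,
# assuming the orbital radius shrinks by a factor of p (volume ~ radius^3).
def volume_reduction(p: int) -> int:
    return p ** 3

print(volume_reduction(4))  # H(1/4): 4^3 = 64x smaller by volume
```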
The power from the plasma can be utilized either directly as heat, with heat exchangers, or converted to electricity by means of two distinct offtake configurations that were developed and patented:
Concentrating Photovoltaics (CPV) – the EUV is stepped down in frequency to the visible spectrum by means of blackbody radiation. The containment shell is made of refractory materials to optimize this conversion to optical energy, which can then be captured with concentrating photovoltaic cells arrayed around the blackbody. The containment sphere is in essence like the filament of a light bulb, delivering the equivalent of multiple suns 24 hours a day, without intermittency.
MagnetoHydroDynamic (MHD) – the plasma heats an expanding gas seeded with conducting silver nanoparticles, which is passed through a transverse magnetic field, converting kinetic energy to electricity.
The more detailed engineering diagram of the SunCell PV design gives a better sense of the relatively compact scale of the device, in this instance only about 3ft high from the base platform.
The device has very high power density and can produce continuous power at 20MW/liter. Below is a working demonstrator prototype from 2016.
To illustrate the comparative power density of the SunCell compared to other stationary concentrating solar applications, they show this slide:

Costs
Costs are low because the capital costs to construct the devices are low: one estimate was $60/kW, which is less than 2% of the capital cost of solar. Other operating costs, for maintenance & fuel, are negligible, because other than the hydrogen fuel, which is derived from water, all the other materials recycle within the device; and with few if any moving parts, the devices can be expected to have life cycles of 20 years or more. The resulting energy costs are estimated at $.01/kWh, substantially lower than any other source.

In business presentations from 2016, BLP attempted to provide more conservative comparisons using the Levelized Cost of Energy tables provided annually by asset manager Lazard, considered among the most reliable & comprehensive surveys available. Using the most recent report, published 11/7/19, BLP places its LCOE in this context, projecting costs at approximately 50% below the cost of solar & 30% below Gas Combined Cycle.
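As a rough sanity check on the $.01/kWh estimate, here is a minimal, undiscounted levelized-cost sketch using only the figures quoted above ($60/kW capital, a 20-year life, continuous operation) and treating O&M and fuel as negligible, per the article; a real LCOE calculation would also include discounting and financing, which are omitted here:

```python
# Minimal undiscounted LCOE sketch from the article's quoted SunCell figures.
CAPEX_PER_KW = 60.0     # $/kW, the quoted capital-cost estimate
LIFETIME_YEARS = 20     # quoted expected device life
CAPACITY_FACTOR = 1.0   # SunCell is claimed to run continuously
HOURS_PER_YEAR = 8760

lifetime_kwh_per_kw = LIFETIME_YEARS * HOURS_PER_YEAR * CAPACITY_FACTOR
lcoe = CAPEX_PER_KW / lifetime_kwh_per_kw  # $/kWh from capital alone

print(f"capital-only LCOE ≈ ${lcoe:.4f}/kWh")
```

Capital alone comes out well under a tenth of a cent per kWh, so the article's $.01/kWh figure leaves ample headroom for O&M, financing and conversion losses.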
With such low operating & capital costs, the revenue model is based on a flat per-diem energy lease rather than a metered price per kWh. Revenue is modeled on a “breakthrough rate” below $.05/kWh, an arbitrary price sufficiently below the market prices of competing sources, but with an enormous built-in margin. Most of the pricing would be based on off-grid provisioning, rather than on wholesale market auctions through ISOs and other grid operators. Hence, the capex & operating costs would represent only approximately 2-5% of revenue, with net earnings above 90%.
Costs will improve at scale, as the largest costs are for the CPV components. At production rates of 10GW annually, the estimated cost of the CPV cells is $32 per kW at a concentration of 2,000 Suns. A cost analysis of parts for a production model of the 2,000 Suns version shows the PV cell assembly constitutes 60%, or $15,000, of the $25,000 total. But at higher-temperature plasmas, at 10,000 Suns concentration, optimizing output efficiencies, CPV costs drop to less than $6 per kW, or $2,800, bringing the PV share down to 23%.
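The drop from a 60% to a ~23% PV cost share follows from the quoted parts costs, if one assumes (the article does not state this explicitly) that the non-PV balance of parts stays fixed at $25,000 − $15,000 = $10,000 at the higher concentration:

```python
# Reconstruct the CPV cost share at higher concentration, assuming the
# non-PV balance of parts stays at $10,000 (an assumption, not stated
# in the article, for the 10,000 Suns version).
TOTAL_2000_SUNS = 25_000   # quoted total parts cost, 2,000 Suns model
PV_2000_SUNS = 15_000      # quoted PV cell assembly cost, 2,000 Suns
PV_10000_SUNS = 2_800      # quoted PV cost at 10,000 Suns

non_pv = TOTAL_2000_SUNS - PV_2000_SUNS              # $10,000
share_2000 = PV_2000_SUNS / TOTAL_2000_SUNS          # 0.60
share_10000 = PV_10000_SUNS / (non_pv + PV_10000_SUNS)

print(f"PV share: {share_2000:.0%} at 2,000 Suns -> {share_10000:.0%} at 10,000 Suns")
```

Under this assumption the share works out to about 22%, close to the article's quoted 23%; the small gap suggests the non-PV parts cost also changes slightly between the two versions.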
Status of Commercialization
If the experimental validation data is accepted, and the resulting production cost calculations are supportable, the pressing question is: what is the status of commercialization? Why are we not yet seeing these devices appearing in the market? What is holding up the show? The book doesn’t get into this issue, although the author has communicated his intent to publish a 2nd edition that explains how this next phase has evolved since 2016.
The website unfortunately does not provide an easily accessible section featuring a sequential history of the commercialization status, whether on the front page, as a top-line item in a dropdown menu, in a sidebar, or as a featured story. However, by digging deeper and comparing earlier & later website materials, the narrative can be reconstructed. The two main sources are Business Plan pdfs & Demonstration Days, found under different tabs.
Business Plan pdf’s:
Earlier commercialization plans indicate the first target market will be industrial thermal energy users. The SunCell operates 3x more efficiently & at 2.5x lower cost if the end product is process heat only and the electric conversion phase of the system is not included. BLP envisioned the rollout timeline shown below, as of 6/14/19. In Phase 1, after industrial users, commercial & residential thermal users are targeted next. Heat for high-GHG generators, steel & concrete, is targeted later, presumably because those industries are more resistant to change.

Phase 2 targets electricity markets, initially with the SunCell Photovoltaic design scaled to 10kW – 150kW. The next target would be scaled to 250kW – 2MW, to address Distributed Energy Resources (DERs) for industrial, commercial and multi-tenant residential buildings, providing micro-grid power that can be “islanded” from the grid connection, simplifying system designs by eliminating the need for battery storage systems, and eliminating utility connection costs & queueing delays. SunCells can operate continuously, but can also be taken offline without curtailment or the need to redirect current to storage; they can simply be shut down, with smart controls to smooth peaking and manage very short ramping & re-start times. Multiple SunCells can be networked with low-voltage private grid interconnections, minimizing the need to even interface with the public grid and reducing complications associated with utility permitting. Further, the potential for micro-grid configurations in rural applications could offer solutions to the wildfire risks in California.
Phase 3 addresses transportation applications: trains, large-scale marine (transport ships that currently burn high-emission bunker oil), buses & trucks, and ultimately passenger vehicles and electric aviation. The MHD version can be scaled down for light vehicles to a size much smaller than either internal combustion engines or EV batteries.
Demonstration Days, found under the News/Archive tab, includes 6 videos of roadshow presentations, with slides, from 1/28/14 – 10/26/16, and 4 additional presentations actually called “Roadshows” (although it is not clear that any of the roadshows are intended to be investor pitches).
The information most relevant to the status of commercialization was a) presentations by two contracted engineering firms, and b) a reference in one of the last Demo Day pdfs to a new set of contractors.
Columbia Tech, a mid-sized management firm in Boston (not a GE or Siemens, but with $200M/yr in revenues and 500 employees), was selected by BLP to manage the transition from the development engineering being done at BLP to the production engineering, which may be further farmed out. They presented slides indicating where they think BLP is in the process.
This is a nice schematic infographic, but there was little in the content of the presenter’s material that disclosed that CT had actually started doing any work, or that there was an expected date for BLP to begin handing off tasks for CT to execute on its path to production development. Later in that same Demo Day, the in-house marketing director showed his own similar schematic, which added some detail but no new information about actual developments.
Masimo (formerly Spire, a PV manufacturer) was contracted to develop a custom CPV system. However, Masimo has also disappeared, with no further reference to either progress on their assigned contract or whether they are even still an industrial partner. Instead, some later pdf slides give some indication that BLP has been reconsidering, weighing non-concentrating PV after a closer cost-benefit analysis.
In the last roadshow pdf 9/12/17, slides #42 – #47 indicated new progress:
TMI Climate Solutions (subsidiary of MiTek, a Berkshire-Hathaway company) appears to have been engaged to develop designs for boilers to offtake heat for thermal applications;
Re Columbia Tech, they announce: “SunCell Commercialization engineering is mature enough to be outsourced to CT. Equipment is being fabricated, procured, shipped”. This seems to be associated with updated injector design solutions. Despite this promising indication, there were no further updates about CT after this report.
PV development progress: the slides indicate changes in design parameters, & perhaps a change from Masimo to SpectroLab (a Boeing company) to complete the development of the triple-junction concentrator cells.
The most recent update in a Business Presentation pdf is dated 6/14/19. However, the material merely refined prior messaging, with some updates on prototyping and engineering solutions for SunCell system components, some new validation experiments conducted by independent scientists, and another review of 17 of the 29 methods for verification of the hydrino explanation. There were no further updates from Columbia Tech, SpectroLab, Masimo, TMI Climate Solutions, or any other development partner about component status or overall system fabrication design status.
Advisory Board: most members have relevant experience in renewable energy development and seem well chosen to facilitate the development goals, and some have very high-level backgrounds, such as James Woolsey, former Director of the CIA. This is at least a hopeful indicator that people with both management talent and influence consider the technology to have potential, and their presence would tend to exert pressure for development progress.
$100-120M of investment capital is mentioned in scattered references, all of it from private equity offerings, but investors are NOT disclosed anywhere on the website. In another reference, there was an indication that some of the investors were utilities, including a rural electric coop in NM, which may have participated by placing pre-orders rather than taking equity.
The Wikipedia page, which is very one-sided & antagonistic, states: “…Investors include PacifiCorp, Conectiv, retired executives from Morgan Stanley and several former BLP board members:
Shelby Brewer, who was the top nuclear official for the Reagan Administration and CEO of ABB-Combustion Engineering Nuclear Power…”
With so much potential for transformation from a technology that leaps forward in efficiency & costs and significantly reduces GHG emissions in both fabrication and operation, one can only hope that Brilliant Light Power will be able to accelerate their commercial development process, and upgrade their website to make updates more transparent and accessible.