Uncategorized

Israel… GUT-CP will make you into the technological powerhouse of the 21st century… THE TECHNOLOGICAL SUPERPOWER!

The British have essentially had a complete mental breakdown because of this!
(they couldn’t handle the power of the Sun). 

No offence Israel, but I’m coming to Tel Aviv… to Jerusalem, and I’m going to get Biblical on your arses! 😀

“For the first time in human history, the power of the Sun has been brought down to the surface of planet Earth”
“The most important discovery since fire”



I have NEVER been so serious about anything in my life… Israel, take all those Quantum Physics books and (don’t burn them) but put them to one side for a moment… take physics back one hundred years and start afresh… read Randell Mills’ work… read The Grand Unified Theory of Classical Physics… study it in depth, from its inception, its history, its continual evolution… the background to hydrino energy… the implications it has in EVERY discipline of science (and philosophy).

This discovery is the greatest in recent human history… anyone that grasps it, understands it and masters it can propel technology and science forward in ways we cannot yet imagine.

EVERYTHING from energy, physics, cosmology, chemistry, genetics, molecular biology, drug development, ‘anti-gravity’, synthetic materials, electronics… there is NOTHING this is not going to affect… aviation, space exploration, mineral extraction, medicine…

I’ve exhausted myself… I don’t know what else to say. Israel… apply your best and brightest to this, whilst everyone else is pursuing ‘Quantum’… you will emerge as the technological powerhouse of the next hundred years… a technological superpower… unrivalled and untouchable.
(sorry to those that disagree but there are military applications, I imagine Israel is already thinking the same!)

It’s the greatest gift you’ve had since… … the Ark of the Covenant! 😀
(you can build yourself a new one! A 21st century Ark of the Covenant!)


http://www.brettholverstott.com/

https://brilliantlightpower.com/

http://philosophystorm.org/koroeda

https://www.infinite-energy.com/iemagazine/issue130/WallIE130.pdf

https://www.infinite-energy.com/iemagazine/issue131/WallIE131Part2.pdf

http://www.infinite-energy.com/iemagazine/issue142/WallIE142.pdf

http://www.blacklightpower.com/wp-content/uploads/pdf/Natutech.nl_Article.pdf

https://vimeo.com/user26477140/videos

http://webcast.massey.ac.nz/Mediasite/Play/8ef7e03e26fc458b8eb7f351738f26811d

https://www.millsian.com/

https://fcnp.com/2018/08/17/great-energy-transition-fires-floods-fossil-fuels-new-energy/

https://www.villagevoice.com/tag/randell-mills/

https://www.infinite-energy.com/images/pdfs/RosenblumIE17.pdf

http://www.cheniere.org/misc/mills.htm

https://cen.acs.org/articles/94/i44/Cold-fusion-died-25-years.html

https://cognitivecarbonspot.wordpress.com/

http://pubs.acs.org/subscribe/archive/ci/31/i10/html/10vp.html

https://www.researchgate.net/topic/hydrino

https://fcnp.com/2018/12/20/great-transition-progress-new-sources-energy/

Other inventions/patents from Dr Randell Mills ranging from genetic sequencing to selective drug delivery to Resonant Magnetic Susceptibility Imaging (ReMSI)… the guy’s a genius!…
https://www.google.com/search?q=randell+mills

Climate Change, energy, Environment, Global Warming, technology

Energy Transitions 2019 – New Actors, New Technologies, New Business Models – March 18, Chatham House, London (THE CHATHAM HOUSE RULE!), The Rothschild Foundation.

“When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.”

£1302.00 … for a one-day conference! No wonder these people rule the world!


Energy Transitions 2019

Leadership in a climate of disruptive change
18 March 2019 – 9:30am to 5:30pm
Chatham House, London

Conference
Energy Transitions 2019
New Actors, New Technologies, New Business Models
18 March 2019 – 9:30am to 5:30pm
Chatham House, London

Overview

A global shift in the energy sector is under way with the rise of renewable energy sources spearheaded by their dominance of investment in the power sector. This is leading to disruptive change as the greater deployment of renewables and many associated technologies, such as storage, are challenging existing business models and threatening the market dominance of the existing actors. At the same time investment in fossil fuels has stabilized, as a slowdown of the financing of coal has been balanced by modest increases in spending in upstream oil and gas.

New global trends, electrification of new sectors such as transport and heating, along with the provision of modern energy services to over a billion people lacking access could further disrupt the energy sector, and the future impacts of these transitions on global energy security and sustainable transitions globally remain unclear.

Therefore, now, more than ever, it is critical that policy-makers and business leaders re-evaluate current and future strategies for delivering the domestic and international energy transition. The fourth annual Chatham House Energy Transitions conference will examine the new drivers of change, focusing on how different economies and industries can make the shift to a low-carbon energy future. Key questions to be explored include:

  • What will incentivize an acceleration in decarbonization and drive low-carbon innovation?
  • How can new technologies be deployed to transform grid interaction and enhance connectivity?
  • What are the implications of the changing policy environment for low-carbon investment?
  • How do disruptive shifts in the energy sector affect the prospects for enhancing access to clean, safe and sustainable energy in developing countries?

The Chatham House Rule
To enable as open a debate as possible, this conference will be held under the Chatham House Rule.


“Have I applied?… maybe!” 😀

The Waddesdon Club: Mainstreaming Climate in Finance and Economic Decision-making

Achieving the Paris Agreement’s goal of limiting temperature increases to ‘well below’ 2°C requires environmental leadership to rapidly emerge within the world’s centres of economic policymaking: treasuries, finance ministries and ministries of economy and business.
Plenary Session at the Waddesdon Club 2018 annual meeting

The urgency of climate change dictates that the next generation of leaders must deliver the economic transformation needed; these individuals need to understand how climate and environment challenges will affect their time in power and define their legacies.
Our Work

The Waddesdon Club is Chatham House’s response – through engaging future leaders, it seeks to equip them with the necessary tools, concepts, language, and capacities for influence needed to advance a mainstream economic agenda for climate change and sustainable development. Core to this approach is an annual retreat at Waddesdon Manor, offering a unique opportunity for participants to deepen their knowledge; widen their peer network, including meeting leading international experts; and share their respective perceptions, experiences and ideas on climate change issues.

Previous Waddesdon Club Retreats

The inaugural Waddesdon Club retreat was held in October 2016 with a broad focus on the importance of low-carbon industrial strategies in mobilizing capital for low-carbon investment, driving down technology costs, fostering innovation and phasing out high-emitting activities.

The second Waddesdon Club retreat took place in early 2018, with a discussion on the practical policy challenges of managing the green economy transition. Expert speakers highlighted the role of international institutions in shaping norms, policies and financial flows. Participants addressed the need for a vision that brings together poverty alleviation, tackling inequality and addressing climate change for a just transition amidst rapid decarbonisation.

ROTHSCHILD FOUNDATION

Chatham House ‘Waddesdon Club’ at Windmill Hill

Achieving the Paris Agreement’s goal of limiting temperature increases requires strong environmental leadership within economic policymaking. In October 2016, Chatham House used the unique setting of Windmill Hill to convene future leaders in finance and economy ministries from across the globe. The self-styled ‘Waddesdon Club’ aimed to enhance understanding of climate and environmental challenges and ensure their consideration within policymaking at the highest level.

Organised by the Energy, Environment and Resources and the International Economics departments at Chatham House, the ‘retreat’ was attended by leading economic policymakers and experts from the fields of climate science, energy and finance. Utilising the inspiration of Windmill Hill, itself a celebration of the conservation and environmental work pursued by the Rothschild Foundation, attendees shared knowledge on the current political and economic context and explored the intersections of environmental and economic policymaking. With the long-term aim of supporting economies to respond more effectively to global change, the event identified recommendations for future discussion.

In order to enhance attendees’ experience and support effective communication, a dedicated mobile app was developed for the event. Co-created by Chatham House and digital tool provider Lumi, the app allowed real-time updates and feedback, as well as being an ongoing resource supporting the implementation of ideas discussed at the event.
Chatham House plan to continue the momentum built through the first Waddesdon Club with future events at Windmill Hill.

Philosophy, physics, quantum physics

The Quantum Physics Fairy Tale (Tales from Immanuel Kant)

“Quantum Physics is a fairy tale! Like a pretty, perfect looking magical castle in the distance… but when you finally get up close and inside… it’s a dark deceptive trap that you can never escape from!… … And it’s about to collapse!”

From David Harriman’s course “The Philosophic Corruption of Physics,”…

Below is David Harriman’s lecture series, “The Philosophic Corruption of Physics/Reality”, in which he walks us through the history of physics and how Kantian philosophy subverted the science.

“I love the way this guy pronounces ‘Kant’… it’s like the posh English way of pronouncing…
“Okey dokey Danny Boy!”
“Quite fitting though really!” 😀





astro-physics, cosmology, GUT-CP, hydrino, particle physics

GAMBIT, Prometheus… (confirming GUT-CP using supercomputers?… unlocking the secrets of the Universe?)

“Anyone that knows me personally, knows I am literally like a little boy when it comes to this one! … X-Men comics! Anyone? No? GUESS WHO MY FAVOURITE X-MAN WAS!” 😀
“And still is?”
“GAMBIT! Well… I’m more inclined towards Magneto’s Brotherhood these days”


In the 21st century, physicists, mathematicians and theoreticians have turned to supercomputers in their quest for the ‘Unified Theory’ of physics. Supercomputers have been used in every field from cosmology, astrophysics and particle physics to the search for the elusive ‘dark matter’ (which Mills has identified as the ‘hydrino’). Thus far each effort (as far as I’m aware) has been based on the ‘Standard Model’ of physics, and has as yet produced no concrete results, new theories or confirmation of existing ones (i.e. the Standard Model).

Supercomputer Confirms Standard Model Theory Of The Universe, Mystery Deepens (eh?)

Why not use a supercomputer to test Mills’ GUT-CP model of atomic structure, and of the Universe? Mills has almost single-handedly built an entirely new model of physics, of the atom and the electron… one that has thus far proven far more accurate than the Standard Model and Quantum Mechanics.
Millsian software (as discussed previously) is a perfect example of the accuracy of his model in regard to molecular structure and the calculation of bond energies. GUT-CP has predicted everything from the accelerated expansion of the Universe to the recently confirmed ‘gravitational’ waves.

One such effort in the search for ‘dark matter’, ‘new physics’ and the fundamentals of particle physics & cosmology is the GAMBIT Collaboration project. GAMBIT is The Global And Modular BSM Inference Tool, and is made up of a collection of researchers from scientific institutions worldwide, using the supercomputer Prometheus (amongst others) in the search for dark matter and a unified theory.

GAMBIT narrows the hiding places for ‘new physics’
The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences
“Is it possible for today’s apparatus to detect the elementary particles of ‘new physics’ that are capable of explaining such mysteries as the nature of dark matter or the lack of symmetry between matter and antimatter? To answer this question, scientists from the international GAMBIT (Global and Modular Beyond-the-Standard-Model Inference Tool) Collaboration have developed a set of software tools that comprehensively analyse data collected during the most sophisticated contemporary experiments and measurements.”
“Although almost a century has passed since Zwicky’s discovery, it has not been possible to investigate the composition of dark matter to this day, nor even to unambiguously confirm its existence. Over this time, theoreticians have constructed many extensions of the Standard Model containing particles that are to a greater or lesser extent exotic.”


The following article from Physics World explains GAMBIT, the history of the ‘discovery’ of ‘dark matter’, and the subsequent search for its identity.
When supercomputers go over to the dark side
“Despite oodles of data and plenty of theories, we still don’t know what dark matter is. Martin White and Pat Scott describe how a new software tool called GAMBIT – run on supercomputers such as Prometheus – will test how novel theories stack up when confronted with real data”
“Unexpected scientific paradigm shifts, where reality turns out not to be as we believed, can be just as exciting and perplexing. One such dramatic change in perspective has been the dawning realization over the last few decades that “ordinary” matter accounts for just a fifth of the matter in the universe, with the rest made of a mysterious “dark” matter. Physicists love unsolved problems, and they don’t come much bigger than working out the nature of this dark stuff.
If a blockbuster movie is ever made about the discovery of dark matter, the next decade may well be the climax. New data from experiments such as CERN’s Large Hadron Collider (LHC) are telling us more about what dark matter can and cannot be, while the recent discovery of gravitational waves reminds us that even century-old theories (general relativity in this case) can be spectacularly confirmed in the blink of an eye”

The Cracow supercomputer Prometheus

GAMBIT project suggests theoretical particles are too massive for LHC detection
“The idea of the GAMBIT Collaboration is to create tools for analyzing data from as many experiments as possible, from different areas of physics, and to compare them very closely with the predictions of new theories. Looking comprehensively, it is possible to narrow the search areas of new physics much faster, and over time also eliminate those models whose predictions have not been confirmed in measurements,” explains Dr. Marcin Chrzaszcz.
Verification of the new physics proposals takes place in the GAMBIT Collaboration as follows: Scientists choose a theoretical model and build it into the software. The program then scans the values of the main model parameters. For each set of parameters, predictions are calculated and compared to the data from the experiments.
“In practice, nothing is trivial here. There are models where we have as many as 128 free parameters. Imagine scanning in a space of 128 dimensions—it’s something that kills every computer. Therefore, at the beginning, we limited ourselves to three versions of simpler supersymmetric models, known under the abbreviations CMSSM, NUHM1 and NUHM2. They have five, six and seven free parameters, respectively. But things nonetheless get complicated, because, for example, we only know some of the other parameters of the Standard Model with a certain accuracy. Therefore, they have to be treated like free parameters too, only changing to a lesser extent than the new physics parameters,” says Dr. Chrzaszcz.
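To make the idea of a parameter ‘scan’ concrete, here is a toy sketch in Python (my own illustration, NOT GAMBIT’s actual code; the model, observable and numbers are invented for the example): pick a model with free parameters, sample the parameter space, predict the observable at each point, and score each point against the measurement with a likelihood.

```python
# Toy illustration of a global-fit parameter scan (NOT GAMBIT itself).
import numpy as np

rng = np.random.default_rng(0)

def predict(mass, coupling):
    """Hypothetical observable predicted by a toy two-parameter model."""
    return coupling**2 / mass**2

# Pretend "experiment": a measured value with a 1-sigma uncertainty
measured, sigma = 1.2e-7, 0.3e-7

def log_likelihood(mass, coupling):
    """Gaussian log-likelihood of the prediction against the data."""
    return -0.5 * ((predict(mass, coupling) - measured) / sigma) ** 2

# Random scan over the 2-D parameter space (GAMBIT uses smarter samplers,
# such as MultiNest and Diver, and models with up to ~128 free parameters)
masses = rng.uniform(100.0, 2000.0, 100_000)
couplings = rng.uniform(0.01, 1.0, 100_000)
logL = log_likelihood(masses, couplings)   # vectorised over all points

best = int(np.argmax(logL))
print(f"best fit: m = {masses[best]:.0f}, g = {couplings[best]:.3f}, "
      f"logL = {logL[best]:.2f}")
```

The real thing differs mainly in scale: physically meaningful models, hundreds of observables from many experiments at once, smarter samplers, and a supercomputer like Prometheus to churn through the points.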

The GAMBIT Project website gives the following explanation –
“Welcome to the GAMBIT homepage. GAMBIT is a global fitting code for generic Beyond the Standard Model theories, designed to allow fast and easy definition of new models, observables, likelihoods, scanners and backend physics codes.”

Thus far, ALL efforts with GAMBIT and Prometheus have led to dead ends (like all of modern QM physics), but it is worth noting that every one of those efforts has been built upon or expanded from THE STANDARD MODEL!

What if this code, or at least these supercomputers, were used for Mills’ GUT-CP and hydrino model? Entering a theory based upon Classical Laws? Expanding upon Newton’s and Maxwell’s equations… essentially entering ALL of Mills’ 30 years of research, theories, predictions and equations… tested against known observables in the Universe?

I believe the results of doing so would be extraordinarily accurate compared to all previous attempts, and would be the beginning of creating a computer model/simulation of the Universe to a degree of accuracy never before witnessed in mathematics, particle physics, astrophysics and cosmology… essentially the beginning of creating a computer simulation of the history of the Universe, past, present and future… A COMPUTER SIMULATION OF THE HISTORY OF THE UNIVERSE!
(as well as confirming hydrino as dark matter, the expansion/contraction of the Universe, gravitational waves and causes etc.)

Surely if all these Quantum Physicists, Astro-physicists and cosmologists are saying Mills is fundamentally wrong… take a risk and run it on a supercomputer such as Prometheus (fitting name considering!)… take a GAMBIT!

Gambit… ‘the ability to convert the potential energy stored in an inanimate object into pure light kinetic energy, thus “charging” that item with highly explosive results.’ 😀

“The intro to the awesomest cartoon ever! I’m going to walk down the aisle to this theme!”

Environment, Global Warming, hydrino, Planet Earth, Terraforming

‘Terraforming Planet Earth’ – Brilliant Light Power and ‘hydrino’ energy (California, Sahara, Africa, Oceans)

“The application of unlimited, unbounded energy are only constrained by human imagination, ingenuity and ambition” – Dr Randell Mills

Brilliant Light Power Terraforming Application Video
(The music! 😀 … No dolphins? 😦 )

“We need to start terraforming the planet in a positive and beneficial way for the sake of the survival of our species and all life on Earth… because at the moment we’re terraforming Earth like General Zod!…
Take California for example, severe drought and water shortages, catastrophic wild fires… … although when those fires reached Rupert Murdoch’s home, I was privately thinking ‘swings and roundabouts!’… … but we could create a global paradise of abundance if we utilised this technology correctly”

As the human population is set to exceed 10 billion later this century, the constraints placed on the planet’s ecosystems and resources are becoming ever more apparent. 2018 has witnessed a number of clear signs of catastrophic global warming and climate change, including continued drought and wildfires in California, severe heatwaves and wildfires in Greece and Europe, continued melting of the Arctic/Antarctic ice sheets, increased intensity in hurricanes and tropical storms, and the slowing of the Atlantic Gulf Stream. It is clear that humans are having a profound and consequential effect on the planet’s environment and ecosystems, in what scientists are calling the ‘Anthropocene Epoch’. Current research shows that, at current living trends, humankind requires between 1.7 and 4 planet Earths to sustain our population demand (depending upon whether we choose to live like the average American citizen! :D)

Hydrino energy is high-density, cheap, non-polluting, safe, and deployable anywhere in the world…

Obviously Brilliant Light Power and hydrino energy have the potential to eradicate the burning of ALL fossil fuels globally, thereby cutting global carbon emissions to ZERO!
Hydrino technology can also replace ‘renewable’ energy sources such as solar and wind farms, which would potentially take up millions of square miles of the Earth’s landmass to host. There is no radioactive or chemical waste, or pollution of any kind.

But Dr Mills and Brilliant Light Power also envision a world in which ‘hydrino’ technologies will be used to transform the planet’s most inhospitable wastelands and arid regions into ‘lush, liveable, crop-producing expanses’ whilst ‘also preventing drought in already cultivated areas such as California.’

A recent article unveiled plans by scientists to transform areas of the Sahara into a lush, rain-fed region with abundant greenery. Although the plan utilises a combination of wind turbines and solar panels, it highlights the possible ways in which future clean technologies could be used to positively alter and transform the Earth’s landscape.
Scientists Have Announced an Incredible Plan to Make It Rain in The Sahara Desert
“Given everything we know about what fossil fuels are doing to the planet, the research offers a little glimpse of how alternative energy technologies could reveal surprising environmental advantages we’re not yet aware of.”

I believe much more research should be conducted, and careful consideration given to the idea of transforming certain regions of the Earth, using hydrino power only. As pointed out in the following article, transforming areas such as the Sahara may have unintended consequences for other regions, such as the Amazon rainforest and Atlantic ocean marine life…
8 Craziest Mega-Engineering Projects That Could Rework Earth
“For one, sand from the Sahara is carried into the air, across the Atlantic, and deposited in South America. The rich dust that falls from the sky, and the rain storms caused by that dust picking up moisture during its transoceanic journey, both fertilize the Amazon rain forest. No desert, no dust. No dust, no rain forest. During that journey, the dust also feeds a variety of sea life.”

Brett Holverstott mentions in his talk how hydrino energy has the potential to help alleviate, if not eradicate :-
Climate change
Ice cap melting
Ocean acidification
The dying of marine life
Being deployable in the third world
The cutting down of the Amazon rainforest
Global city smog
Construction of river dams
… amongst numerous other potential environmental benefits.

Quite simply, this technology has the potential to avert the almost inevitable human catastrophe lurking around history’s corner, i.e. Earth’s sixth mass extinction event (potentially including our own).

I also believe our new understanding of atomic structure and molecular physics (Millsian), could pave the way for new cleaner technologies and sustainable materials in industries other than energy. These may include :-
– Plastics and materials
– Food and crop growth
– Fertilisers and agriculture
– Mining and mineral extraction
– Methods for cleaning up past environmental damage (plastics, carbon emissions, nuclear and toxic waste), essentially reversing the damage already caused by the Industrial Revolution.

“The application of unlimited, unbounded energy are only constrained by human imagination, ingenuity and ambition” – Dr Randell Mills

Einstein… clever guy… didn’t like Quantum Mechanics! 😀

“Before we venture out into the stars, looking to terraform other planets… maybe we should take a look back at our own and try and reverse the damage we’ve done here” – Danny Hurley

Robert Senftleben: Terraforming Planet Earth
Large parts of the surface of our planet have been devastated by human activity. Terraforming on a human scale is needed to bring these landscapes back to life. The knowledge and technology are there, and you can learn how to use them and participate.

EXCELLENT TED TALK!
How to green the world’s deserts and reverse climate change | Allan Savory
“Desertification is a fancy word for land that is turning to desert,” begins Allan Savory in this quietly powerful talk. And terrifyingly, it’s happening to about two-thirds of the world’s grasslands, accelerating climate change and causing traditional grazing societies to descend into social chaos. Savory has devoted his life to stopping it. He now believes — and his work so far shows — that a surprising factor can protect grasslands and even reclaim degraded land that was once desert.

Update (13/09/2018) ;D
Gaia 2.0  (Timothy M. Lenton, Bruno Latour – University of Exeter)

Famous theory of the living Earth upgraded to ‘Gaia 2.0’

September 13, 2018, University of Exeter

“A time-honoured theory into why conditions on Earth have remained stable enough for life to evolve over billions of years has been given a new, innovative twist.

For around half a century, the ‘Gaia’ hypothesis has provided a unique way of understanding how life has persisted on Earth.
It champions the idea that living organisms and their inorganic surroundings evolved together as a single, self-regulating system that has kept the planet habitable for life—despite threats such as a brightening Sun, volcanoes and meteorite strikes.
However, Professor Tim Lenton from the University of Exeter and famed French sociologist of science Professor Bruno Latour are now arguing that humans have the potential to ‘upgrade’ this planetary operating system to create “Gaia 2.0”.
They believe that the evolution of both humans and their technology could add a new level of “self-awareness” to Earth’s self-regulation, which is at the heart of the original Gaia theory.
As humans become more aware of the global consequences of their actions, including climate change, a new kind of deliberate self-regulation becomes possible where we limit our impacts on the planet.
Professors Lenton and Latour suggest that this “conscious choice” to self-regulate introduces a “fundamental new state of Gaia” which could help us achieve greater global sustainability in the future.
However, such self-aware self-regulation relies on our ability to continually monitor and model the state of the planet and our effects upon it.
Professor Lenton, Director of Exeter’s new Global Systems Institute, said: “If we are to create a better world for the growing human population this century then we need to regulate our impacts on our life support-system, and deliberately create a more circular economy that relies—like the biosphere—on the recycling of materials powered by sustainable energy.”
The original Gaia Theory was developed in the late 1960s by James Lovelock, a British scientist and inventor. It suggested that both the organic and inorganic components of Earth evolved together as one single, self-regulating system which can control global temperature and atmospheric composition to maintain its own habitability.
The new perspective article is published in leading journal Science on September 14, 2018.
It follows recent research, led by Professor Lenton, which offered a fresh solution to how the Gaia hypothesis works in real terms: Stability comes from “sequential selection” in which situations where life destabilises the environment tend to be short-lived and result in further change until a stable situation emerges, which then tends to persist.
Once this happens, the system has more time to acquire further properties that help to stabilise and maintain it—a process known as “selection by survival alone”.
Creating transformative solutions to the global changes that humans are now causing is a key focus of the University of Exeter’s new Global Systems Institute.”

Gaia Hypothesis: Humans Have Fundamentally Altered Earth’s Self-Regulation System

cosmology, GUT-CP

Possible confirmation of Mills’ ‘Oscillating Universe’ and GUT-CP model? (remnants of a prior Universe)

“I actually emailed Roger Penrose with this last year.”
“Did you get a response?”
“No… but then he is ‘Sir’ Roger Penrose, and what we know of the Queen’s Knighthood list is it reads like a God damn sex offen..
“Drop it Danny Boy!”


Recent observations made by both the Planck observatory and the BICEP2 South Pole telescope indicate possible remnants of a previous Universe. This possible indication has been interpreted by a number of physicists in different ways, but Roger Penrose of Oxford University believes what is being seen in the Cosmic Microwave Background (CMB) data are radioactive swirls dubbed ‘Hawking points’, thus being proof of a previous Universe existing prior to this present one, in what he and his colleagues call “conformal cyclic cosmology” (CCC).
(It is worth noting these conclusions are drawn mainly from the data from the Planck observatory, and raw data from the BICEP2 is still to be released)

Radioactive swirls in the cosmos may rewrite the origin story of the universe
“The idea is called “conformal cyclic cosmology” (CCC), and what it asserts is that, rather than starting from a big bang, the universe continually expands and contracts, each time leaving behind tiny bits of electromagnetic radiation that remain as the process occurs over and over. The late Stephen Hawking predicted tiny dots of radiation, which others call ‘Hawking points’, left over from this cycle.”

These Swirls of Light Could Be Signs of a Previous Universe Existing Before Ours
“Penrose’s CCC model was developed as an answer to a curious imbalance between measurements of our early Universe’s temperature and the state of order we might expect.
According to him, this imbalance could be accounted for by the death of a pre-existing universe that was there before the Big Bang. Oscillating universes come in a few different forms, depending on your choice of model. Some suggest the Universe is destined to fall back in itself one day.”

This observation, and its interpretation, caught my attention because, according to Mills’ GUT-CP, we live in an ‘Oscillating Universe’, eternally expanding and contracting over a period of a trillion years or so (give or take). According to Mills, the Universe expands from a radius of 9 billion light years to 312 billion light years, then contracts back again, each phase lasting roughly 450 billion years. This happens because, during the expansion phase (or ‘annihilation’), matter is converted directly into energy (through various ‘hydrino’ chemical and nuclear processes throughout the Universe, including in our own Sun), which causes spacetime to expand everywhere throughout the Universe.

NOTE – Dr Randell Mills successfully predicted the acceleration of the expanding Universe in his model prior to its discovery. He also successfully predicted gravitational waves in his model.

After the Universe has expanded to its peak radius (312 billion light years), the engines of the expansion phase have ‘run out’, so to speak (stars, supernovas, neutron stars etc.), and most of the Universe’s matter has been converted into energy… the Universe will then begin its contraction phase and the process is reversed. Radiation will create particles, which in turn create atoms, converting ‘dark energy’ into matter, thus contracting spacetime everywhere throughout the Universe.

“The conversion of matter into energy causes spacetime, and thus the universe, to expand, since light has inertial but no gravitational mass. The acceleration of the expansion of the presently observed universe was predicted by Mills in 1995 and has since been confirmed experimentally. Mills predicts that the universe expands and contracts over thousand-billion year cycles.” – Brilliant Light Power
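To put the numbers above into a picture, here is a minimal toy sketch that simply assumes a smooth sinusoidal oscillation between the quoted radii over a roughly thousand-billion-year cycle. The sinusoidal form is my own parameterisation for illustration only; Mills derives the actual time dependence in GUT-CP Vol. III.

```python
# Toy visualisation of an oscillating-universe radius (illustrative only;
# Mills derives the actual time dependence in GUT-CP Vol. III).
import numpy as np

R_MIN = 9.0      # billion light years (minimum radius, per the post)
R_MAX = 312.0    # billion light years (maximum radius, per the post)
PERIOD = 900.0   # billion years: ~450 expanding + ~450 contracting

def radius(t_gyr):
    """Assumed sinusoidal radius at time t (billions of years),
    with t = 0 taken at minimum radius."""
    phase = 2 * np.pi * t_gyr / PERIOD
    return R_MIN + (R_MAX - R_MIN) * (1 - np.cos(phase)) / 2

for t in (0, 225, 450, 675, 900):
    print(f"t = {t:>4} Gyr  ->  R = {radius(t):6.1f} Gly")
```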

Brett Holverstott ‘Randell Mills and the Search for Hydrino Energy’ Pg 283… Pg. 288 “Mills’s universe did not start with a bang. It may have structures that predate the beginning of the expansion by many billions of years.”

It is important to note that the Big Bang Theory is just that… a theory, and no direct evidence for it has ever been put forward. The idea was based upon the fact that the Universe was expanding (prior to knowledge of the acceleration of that expansion, which should in turn discount the theory). Many physicists and cosmologists throughout the 20th century questioned the theory, and by the early 21st century a vast array of evidence had slowly accumulated to discount it.

– Riess 1998. Hubble data showed that the Universe was NOT decelerating as predicted by the Big Bang Theory, but actually accelerating. Unknown to most in science, Mills had successfully predicted this two years prior in his GUT-CP model. (This is when the idea of ‘dark energy’ came to the forefront of physics in order to account for this surprising observation.)

– Space Circles Are Proof of a Pre-Big Bang Universe? (2010) Recycled-universe theory “works on paper,” but details missing, critics say.
– 3 Theories That Might Blow Up the Big Bang (Steinhardt and Turok)
“Steinhardt and Turok—working closely with a few like-minded colleagues—have now developed these insights into a thorough alternative to the prevailing, Genesis-like view of cosmology. According to the Big Bang theory, the whole universe emerged during a single moment some 13.7 billion years ago. In the competing theory, our universe generates and regenerates itself in an endless cycle of creation. The latest version of the cyclic model even matches key pieces of observational evidence supporting the older view.”
“We weren’t looking for cycles,” Steinhardt says, “but the model naturally produces them.” After a collision, energy gives rise to matter in the brane worlds. The matter then evolves into the kind of universe we know: galaxies, stars, planets, the works. Space within the branes expands, and at first the distance between the branes (in the bulk) grows too. When the brane worlds expand so much that their space is nearly empty, however, attractive forces between the branes draw the world-sheets together again. A new collision occurs, and a new cycle of creation begins. In this model, each round of existence—each cycle from one collision to the next—stretches about a trillion years. By that reckoning, our universe is still in its infancy, being only 0.1 percent of the way through the current cycle.
The cyclic universe directly solves the problem of before. With an infinity of Big Bangs, time stretches into forever in both directions. “The Big Bang was not the beginning of space and time,” Steinhardt says. “There was a before, and before matters because it leaves an imprint on what happens in the next cycle.”

– As these Forbes articles explain, it is now accepted amongst theoreticians that a singularity within the Big Bang Model is an impossibility within mathematics and the laws of physics.
There Was No Big Bang Singularity
The Big Bang Wasn’t The Beginning, After All

– As Holverstott states in ‘Randell Mills and the Search for Hydrino Energy’, numerous ‘ancient’ structures are being discovered throughout the Cosmos that seem to predate the accepted 13.8-billion-year ‘beginning’.
– Including a ‘quasar that is 13 billion light years away, yet powered by a black hole about 2 billion times the mass of the Sun’ (Mortlock 2011).
– A star smaller than our own Sun, which has almost no trace of elements heavier than hydrogen or helium, with a helium ratio lower than that theoretically created in the big bang, dubbed ‘The Star That Should Not Exist’ (which is so sweet! :D)

“Hi! I’m SDSS J102915+172927, in the constellation Leo (The Lion). I’m older than 13 billion years and everyone says I shouldn’t have come into existence in the first place… WELL FUCK YOU!… … What do you want? A picture?” 😀

– More recently, a gargantuan black hole found in 2013 again throws doubt upon the notion that nothing in our ‘known’ Universe existed prior. I believe many more objects and structures will be found in the coming years and decades that will support Mills’ GUT-CP and his ‘Oscillating Universe’ model.
– Young black hole had monstrous growth spurt
Super-massive object found in early Universe tests theories of cosmic evolution.
“A black hole that grew to gargantuan size in the Universe’s first billion years is by far the largest yet spotted from such an early date, researchers have announced. The object, discovered by astronomers in 2013, is 12 billion times as massive as the Sun, and six times greater than its largest-known contemporaries. Its existence poses a challenge for theories of the evolution of black holes, stars and galaxies, astronomers say.”

Mills’ model of an Oscillating Universe is NOT to be confused with other models such as the Big Bounce or CCC, which are still based in quantum models such as string or M-theory… this is a truly original, more elegant and simpler model, arrived at through a classical understanding of atomic structure and gravitational forces (i.e. the hydrino model).
For further details see…
Summary Of Randell Mills’s Unified Theory (Holverstott)

and of course Randell Mills’ explanation itself in ‘The Grand Unified Theory Of Classical Physics, Volume III – Collective Phenomena, High Energy Physics, & Cosmology’
(Ch. 32.7 to 32.9.5: Cosmology, The Expanding Universe and the Microwave Background, The Period of Oscillation, etc.)

Uncategorized

The Great Pyramid and The Ark Of The Covenant… IT WAS ME WHO FIRST SUGGESTED IT!

“The Bible speaks of the Ark leveling mountains and laying waste to entire regions. An army which carries the Ark before it is invincible.” ―Marcus Brody (Raiders Of The Lost Ark)

I remember the day exactly! Summer 2012… we were sat around discussing the Great Pyramid and its potential for being a technological or power device of some kind, and the prospect of missing components was brought up, and I said…
“Has anyone ever considered the Ark being THE missing piece?” 
(it makes the Exodus story slightly more amusing) 😀

Did Moses Grab the Ark & Run?
“Have you ever found it curious that as Moses was leading his people out of Egypt, the Pharaoh suddenly commanded his entire army to go after Moses and stop him? Why the change of heart? Why risk further judgment at the hand of God through Moses?”

But it seems that theory has been developed since…

ARK OF THE COVENANT. The return of the Ark of the Covenant to Beth-Shemesh (I Samuel 6:13). Wood engraving after Gustave Dore, 19th century.

The Great Pyramid of Giza—where the Ark of the Covenant was originally located?

According to ancient religious texts, the Ark of the Covenant, also known as the Ark of the Testimony, is a gold-covered wooden chest described in the Book of Exodus as containing the two stone tablets of the Ten Commandments. Named in various verses of the Torah and the Bible, the Ark remains a great mystery to historians. While some claim that it never existed, others still seek it, convinced that it is real.
What if the Ark of the Covenant was, in fact, real, and has a mysterious connection to one of the most enigmatic ancient structures on the surface of the planet: The Great Pyramid of Giza?

So what do we know about the Ark of the Covenant? We know it was a powerful ‘device’. Only a select few could approach it. When approached, anyone had to take extreme precautions and wear some sort of protective outfits. The ark was plated with gold.
Furthermore, we know from the Book of Exodus, that after the Israelites were released from Egypt, God summoned Moses to the peak of the holy mountain giving him two stone tablets that were carved with the ten commandments. At that moment, God provided Moses with exact instructions on how to build the Ark of the Covenant, one of the most enigmatic ‘devices’ in the history of mankind.

According to the Book of Exodus, the dimensions and characteristics of the Ark of the Covenant are 2½ cubits in length, 1½ in breadth, and 1½ in height, which is approximately 131×79×79 cm or 52×31×31 in.

The entire Ark was plated with gold, and a crown of gold was put around it. Four rings of gold were attached to its four feet, two on each side, and through these rings staves of shittim-wood plated with gold were placed to carry the Ark; these were never to be removed. A golden cover, also called the kapporet, was placed above the Ark. Interestingly, numerous researchers suggest that if the construction details of the Ark are those recorded in the past, then the Ark would essentially resemble an electrical capacitor: two electrodes separated by an insulator.

Strangely, it is believed that the Great Pyramid of Giza may have been deeply connected to the Ark of the Covenant, and it is hypothesized that the pyramid was used to house the Ark.

The defenders of this idea base their case on the fact that the measurements given in the sacred texts coincide with the size of one of the sarcophagi of the Egyptian monument, in which no mummy has ever been found.
As noted by Gerry Cannon of Crystalinks, “the word ark comes from the Hebrew word aron, which means a chest, box. Its dimensions are described by the bible as 2.5 cubits by 1.5 cubits by 1.5 cubits (45 inches by 27 inches by 27 inches). Curiously, this is the exact volume of the stone chest or porphyry coffer in the King’s Chamber in the Great Pyramid in Egypt. This coffer was the only object within the King’s Chamber, as the Ark was the single sacred object within the Holy of Holies, in the Temple. Also the laver, or basin, that the priests used to wash their feet had the identical cubit dimensions.”

Furthermore, Cannon elaborates that “The cubit dimensions of the inner chamber of the Temple, the Holy of Holies, are precisely identical in size to the King’s Chamber in the Pyramid and the same volume as the molten sea of water on the Temple Mount as prepared by King Solomon. Since the Pyramid was built and sealed long before the days of Moses, when he built the Ark and the Holy of Holies, and had remained sealed for over twenty-five centuries until the ninth century after Christ, there is no natural explanation for the phenomenon of both structures having identical volume measurements.”
No matter how far-fetched this theory may sound, various groups of scientists and historians are trying to determine the truth of it. In addition to the historical value that the Lost Ark would possess, many believe in the magical power of the object.

The Connection Between the Pyramid and the Ark of the Covenant
“This seems unlikely. Rather, it is more likely that the Holy of Holies and the Ark were relics from an earlier time, and were taken out of Egypt by the fleeing Israelites. The Ark is said to have once been kept in the King’s Chamber of the Great Pyramid. The famous ‘lidless’ coffin of Cheops was in actuality the receptacle for the Ark of the Covenant.”
The capacity of the Ark, based on its Biblical measurements, was 71,282 cubic inches, while the measure for the granite container in the King’s Chamber is 71,290 cubic inches. In 1955, Dr. Alfred Rutherford of the Institute of Pyramidology in Illinois performed an experiment in which he re-assembled the pieces of an exact replica of the Ark inside the King’s Chamber and lowered it into the Chamber’s stone box. It fit remarkably well, with a relatively uniform half-inch clearance on all four sides of the replica. Not without significance is the fact that the dimensions of the King’s Chamber itself form a double-cube—precisely the same dimensional configuration of the Hebrew Tabernacle Holy of Holies.
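Since several different volume figures appear across these excerpts, a quick back-of-the-envelope script (my own arithmetic, using only numbers quoted in this post) shows how strongly the Ark’s computed volume depends on the cubit length assumed:

```python
# Back-of-the-envelope check of the volume figures quoted above.
# Ark dimensions per Exodus: 2.5 x 1.5 x 1.5 cubits = 5.625 cubic cubits.
ARK_CUBIC_CUBITS = 2.5 * 1.5 * 1.5

def ark_volume_cubic_inches(cubit_in_inches):
    """Ark volume in cubic inches for an assumed cubit length."""
    return ARK_CUBIC_CUBITS * cubit_in_inches ** 3

# The 45 x 27 x 27 inch figure quoted earlier assumes an 18-inch cubit:
print(f"{ark_volume_cubic_inches(18.0):,.0f} cu in")      # 32,805

# The 131 x 79 x 79 cm figure implies a ~20.6-inch (52.4 cm) royal cubit:
print(f"{ark_volume_cubic_inches(20.6):,.0f} cu in")      # ~49,173

# Solving backwards: the 71,282 cu in figure above implies this cubit:
implied_cubit = (71_282.0 / ARK_CUBIC_CUBITS) ** (1.0 / 3.0)
print(f"implied cubit: {implied_cubit:.1f} inches")       # ~23.3 inches
```

In other words, the ‘identical volume’ comparison stands or falls on which cubit the ancient builders are assumed to have used.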
Was it here, inside the Great Pyramid, that the original Ark was first energized?
In the Book of Exodus chapter 25, verses 10 through 21, appear these descriptions of the Ark: it was a lidless rectangular box made of shittim wood (probably acacia) measuring two and a half cubits long by one and a half cubits wide and high—in terms of the Egyptian royal cubit, about 4 feet 4 inches by 2 feet 7 inches. The box was covered in gold over its inner and outer surfaces, with a gold crown or border around the top rim. Gold rings were added to each corner, through which were inserted two carrying poles, also made of gold-covered wood, that were designated never to be removed. On top of the box was placed a lid or “mercy seat” that matched the length and breadth dimensions of the box and was also covered with gold. On the lid were two solid gold statues of angelic beings called cherubim, with a winged figure placed at either end of the lid, facing each other, their four wings outstretched to form a canopy or arch.
In terms of its static electric charge potential, the gold coverings of the Ark form the positive and negative conductive layers, and the wood forms the insulator separating the two. The cherubim statues on the lid, with one figure connected with the outer gold layer and the other figure connected to the inner gold layer would have served as positive and negative terminals.
While a Leyden jar the size of a modern coffee jar can store a charge of approximately 200 volts, something the size of the Ark would have held a charge potential of several thousands of volts. Particularly in the hot dry air of the Sinai, the Ark could have stored enough static electricity to have been fatal to anyone even coming close to it. This is precisely what happened as portrayed in Hebrew literature on two occasions. In Leviticus 10: 1-2, two priests, Nadab and Abihu, failed to approach the Ark in a prescribed manner, “And there went out a fire, and devoured them.” In Second Samuel 6: 6-7, while the Ark was being transported to Jerusalem by oxen, and was being shaken badly, another priest named Uzzah tried to steady the box by placing his hands on it. “And God smote him there for his error, and there he died by the Ark of God.”
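For a sense of scale on this capacitor reading, here is a rough parallel-plate estimate using the Ark’s quoted dimensions. The wall thickness and wood permittivity are my own assumed values, and the geometry is idealised as flat plates, so treat it as an order-of-magnitude sketch only:

```python
# Rough parallel-plate estimate of the "Ark as capacitor" idea above.
# The wall thickness and wood permittivity are assumed values, not from
# the post; the box is idealised as flat plates (gold-wood-gold).
EPSILON_0 = 8.854e-12       # vacuum permittivity, F/m
EPSILON_WOOD = 3.0          # dry acacia wood, roughly 2-4 (assumption)
WALL_THICKNESS = 0.05       # metres (assumption)

# Box dimensions quoted above: 131 x 79 x 79 cm
length, width, height = 1.31, 0.79, 0.79
wall_area = 2 * (length * width + length * height + width * height)

capacitance = EPSILON_0 * EPSILON_WOOD * wall_area / WALL_THICKNESS
print(f"capacitance ~ {capacitance * 1e9:.1f} nF")     # a few nanofarads

# Energy stored at various static-charge voltages: E = C * V^2 / 2
for volts in (500, 5_000, 50_000):
    joules = 0.5 * capacitance * volts ** 2
    print(f"{volts:>6} V -> {joules:.4f} J")
```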
Other tell-tale electrical elements can be seen in these additional features: …
http://www.forgottenagesresearch.com/out-of-place-artifacts-series/Enigmas-of-the-Ark-of-the-CovenantAn-Ancient-Techn.htm

THE RELATIONSHIP BETWEEN THE ARK OF THE COVENANT AND THE KING’S COFFIN

Many pyramidologists in the 19th century pointed out an amazing correlation: the volume or cubic capacity of the Coffer in the King’s chamber is exactly the same as the volume of the Ark of the Covenant as described in the Bible. Could there be some common measurement in use that goes back to antiquity? Could there be some common builders involved? It has also been shown that the “pyramid inch”, which is discussed in our pyramidology sections, is the same unit of measurement that was used to build “Noah’s Ark”, “Solomon’s Temple”, and the “Ark of the Covenant”. We will also discuss in another article the relationship between a standard Egyptian unit of area and Stonehenge.
In our article on the “Arab who got the shock of his life on the summit of the Great Pyramid” we mentioned the possibility that there may be interesting electro-static phenomena on the top of the Great Pyramid. If we go back to some ancient legends about the Ark of the Covenant, we find some interesting statements. The Ark of the Covenant was placed in the Holy of Holies and could only be approached once a year by the High Priest. It was considered so sacred that it was believed that if the High Priest, or anyone else who came near it, had any impure thoughts, they would be struck dead with a bolt of lightning. Here is a little-known fact: the Israelites would tie a rope to the leg of the High Priest when he went in to the Holy of Holies, in case he was struck dead by lightning. If that happened, they could just pull him out with the rope and not risk someone else being killed by going in. Do you remember the Indiana Jones movie “Raiders of the Lost Ark”, when the Nazis approached the Ark and were all struck dead with a bolt of lightning? This was based on actual legend. Also, in the Bible there is an instance when someone touched the Ark in order to prevent it from falling, and they too were struck dead instantly. Is this just mythology, or is there some basis to these occurrences?
We know from the Bible that the Ark was made of acacia wood and lined inside and out with gold. What we have here is two conductors separated by an insulator: a capacitor. It has been calculated that this Ark might have been able to act as a capacitor and produce an electric discharge of over 500 volts. This would cause the type of phenomena mentioned in the Bible in association with the Ark. Why did the Israelite army always march to war with the Ark at the front? There is much interesting speculation here.
This is important to pyramid research because the Great Pyramid may produce some interesting electrostatic effects, especially at the summit.
Our association will be putting forward a proposal to study the physical effects at the summit and elsewhere in the Great Pyramid. Using specific physical apparatus, we would like to take measurements and conduct experiments, especially on the summit. We would also like to carry out some experiments at the point where the original pyramid with its capstone would have peaked; we would need a platform to reach this height. Did the Great Pyramid somehow act as a capacitor, and for what purpose? This could be investigated using direct measurements and testing.

 

Uncategorized

How Great Pyramid was ‘used as a MACHINE to alter cosmic rays’… ‘similar to a cold fusion reaction’ (KGB documents?)

I know the U.S. military conducted a survey on the Great Pyramid in the late 1970s (’78?)… CLASSIFIED… all we know is they came to the conclusion that it was a technological device of some kind.

  1. It wasn’t built by Khufu
  2. It’s older than 4,500 years
  3. It’s a precision-built engineering device of some kind… it served a technological function.

The rest you have to try and figure out for yourself… it’s like the planet’s greatest puzzle!
YouTube’s Ancient Architects has done some splendid research and has some pretty good ideas on what its function was…







Anyway, as we say in the UK… “Let’s see vat KGB think bout dis” 😀

Egypt REVELATION: How Great Pyramid was ‘used as a MACHINE to alter cosmic rays’

EGYPT’s Great Pyramid holds “powers to alter cosmic rays”, according to secret KGB documents that claim the pyramid is intimately connected with Earth.

The tallest pyramid ever constructed in Egypt, the Great Pyramid, was considered to be a “wonder of the world” by ancient writers. Some 4,500 years ago, the ancient Egyptians built the Great Pyramid of Giza as a tomb for the pharaoh Khufu, also known as Cheops, one that would ferry him to the afterlife. According to secret KGB documents, scientists began to wonder if the entire Giza Plateau was designed and engineered for a different purpose.

In Amazon Prime’s “The Secret KGB Abduction Files”, scientists and theorists question and examine the possibilities that the Great Pyramid operates as a “paranormal machine”.
One scientist made an astonishing claim that “the pyramids, or certainly the Great Pyramid, was a machine [because] it had some sort of purpose.”
He added: “The measurements are extremely precise.”

The ratio between the height of the pyramid and the perimeter of the pyramid is the same as the ratio between the radius of the Earth and the circumference of the Earth.

So you can actually see the pyramid is sort of like a three-dimensional triangular depiction of a hemisphere which might suggest that there was some reason to have the pyramid resonate with the planet.
The scientist concludes that the “pyramids have powers to alter cosmic rays”.
The pyramids are in effect huge prisms, capable of concentrating energy. Capturing the light from the stars would initiate a process that would turn the Great Pyramid into an interstellar transmitter.
Theorists claim the three pyramids and Sphinx could be integral parts of an immense machine designed by alien engineers.

According to scientific reports, all the monuments of Giza are linked by a master control mechanism inside the Great Pyramid.
Inside the Great Pyramid, a passageway leads to the king’s chamber, and above the sarcophagus is a tunnel some call the star shaft.
When a specific celestial alignment occurs, starlight streaks down the shaft. These scientists theorise that radiant energy striking the sarcophagus could initiate something similar to a cold fusion reaction.
The prism structure of the pyramid would magnify and transfer the energy to the other monuments.
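The height-to-perimeter ratio claim in the excerpt above is the one part that is easy to check. Here is a quick sketch using the commonly cited original dimensions of the Great Pyramid (and noting that circumference divided by radius is 2π for any sphere, Earth included):

```python
# Checking the height/perimeter ratio claim from the excerpt above,
# using commonly cited original dimensions of the Great Pyramid.
import math

HEIGHT = 146.6          # metres (original height, with capstone)
BASE_SIDE = 230.3       # metres (average base side)

perimeter = 4 * BASE_SIDE
print(f"perimeter / height = {perimeter / HEIGHT:.4f}")   # ~6.2838
print(f"2 * pi             = {2 * math.pi:.4f}")          # 6.2832

# For any sphere, circumference / radius = 2*pi by definition, so the
# Earth comparison reduces to: is perimeter/height ~ 2*pi? It is, to
# about 0.01 percent with these dimensions.
```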

 

 

Uncategorized

Neurons From People With Autism Grow Differently, Scientists Find

“Whoever said being on the Asperger’s scale was a disorder?” :/ “I imagine it’s quite orderly for some… it’s everyone else who’s got the fucking disorder!” 😀

Seriously… I know researchers and autism ‘experts’ that quietly believe it may be the next stage of human evolution (these aren’t spiritual pseudoscientists)

It doesn’t need curing or treating… the rest of the world should learn to adapt to those on the spectrum! (even if it is sometimes pretty severe)

Nerve Cells In People With Autism Develop Faster And Grow Larger, Study Finds

In a “very exciting finding”, researchers at California’s Salk Institute have discovered a difference between the way nerve cells develop in people who have autism spectrum disorder (ASD) and those who don’t. The researchers hope their study will contribute to a better understanding of why ASD develops, helping to create better diagnostic techniques and treatments.
ASD is a disability that impacts how people communicate and interpret the feelings and behaviors of others, as well as how they experience the world around them. Most commonly diagnosed in children, particularly boys, it impacts one in every 59 kids in the US. There is currently no cure and the exact cause or causes of the condition are still unclear – both genetics and hyperconnectivity within the brain are believed to play a role.
“It’s currently hypothesized that abnormalities in early brain development lead to autism, but the transition from a normally developing brain to an ASD diagnosis is blurred,” said Simon Schafer, a postdoctoral fellow at the Salk Institute, in a statement. “A major challenge in the field has been to determine the critical developmental periods and their associated cellular states. This research could provide a basis for discovering the common pathological traits that emerge during ASD development.”
To look at how nerve cells, aka neurons, develop in people with ASD, the researchers took skin cell samples from eight people with ASD and five people without the condition and transformed them into pluripotent stem cells. These are stem cells with the ability to turn into any type of cell in the body. By exposing the cells to certain chemicals, the team could then direct them to develop into neurons.
They then used molecular “snapshots” to look at genetic activity in the cells at different developmental stages by analyzing their RNA (a molecule mainly involved in protein production that can contain genetic information). They took a look at the cells at five different points and found something interesting early on in their development.

A two-dimensional culture of cortical neurons, stained in red and green, grown from induced pluripotent stem cells created from volunteers’ skin cells.

Neurons From People With Autism Grow Differently, Scientists Find

Researchers announced this week that they may have helped illuminate another small piece of the puzzle that is autism spectrum disorder (ASD), a developmental disorder that can impact social communication and behavior.
In a paper published in the journal Nature Neuroscience, an international team of neuroscientists described a process that used participants’ skin cells to eventually grow neurons in petri dishes. Their observations of the neurons’ growth could help unravel some of the mysteries about early brain development in ASD.

The group, led by Fred Gage at the Salk Institute in La Jolla, California, collected skin cells from eight volunteers diagnosed with ASD and five non-diagnosed volunteers who acted as the control group. Then, using technology that was announced in 2006, the team rolled back the developmental clock. They converted those skin cells into what are called induced pluripotent stem cells, which have the power to mature into any type of cell in the body. In this case, Gage and his team guided the cells’ development (which played out in petri dishes containing so-called growth culture) using specific chemical factors so that they’d end up as neurons.
As they watched the cells grow, the researchers tracked which genes were expressed — or switched on and used to create things like proteins — and when. They found that cells created from participants with ASD expressed some genes earlier than cells from the control group. Past research has linked those genes with an increased risk of ASD. Additionally, neurons spawned from the cells of participants with ASD sprouted more complex branches and grew faster than control cells. Understanding how those developmental changes affect the brain as a whole could lead to a better understanding of the neurology behind ASD.
“Although our work only examined cells in cultures,” says Gage in a press release, “it may help us understand how early changes in gene expression could lead to altered brain development in individuals with ASD. We hope that this work will open up new ways to study neuropsychiatric and neurodevelopmental disorders.”

Stem cells used to trace autism back to the formation of neurons

Gene-activity changes come before any visible differences in neurons.

While autism is a spectrum of disorders, it’s clear that the more significant cases involve physical differences in the brain’s nerve cells. Several studies have reported an excess in connections among neurons in the brains of people with autism. But when does this happen? Changes in neural connections are key components of learning and memory, and they can happen at any point in life; major reorganizations in connectivity occur from before birth up to the late teens.
Anecdotal reports of autism’s symptoms often suggest an onset between one and two years old. But a new study places the critical point extremely early in embryo development—at a point before there are any mature nerve cells whatsoever.
A series of challenges
Figuring out how autism starts is complicated. To begin with, it’s a spectrum that might include more than one disorder. You also can’t know in advance who’s going to develop it, so you can only look at it retrospectively, after the problems are apparent. Finally, the human brain is simply not something you can ethically do invasive experiments on.
The new work relies on techniques that weren’t available just a few decades ago. We now know how to take skin cells and convert them to stem cells. We’re able to direct stem cells to develop along the lineages that contribute to brain development. And we can structure that development in three dimensions to produce a miniature version of the mature tissue, termed an organoid. Combined, these approaches allow us to study the development of autism using nothing more than a small skin sample from autistic individuals.
For the new research, a large international team obtained skin cells from eight autistic people and five controls. These were converted into stem cells and then induced to develop along a pathway that leads to brain-like neurons. This pathway includes an intermediate step, called a neural stem cell, in which the cells are committed to developing as nerve cells but haven’t adopted a mature, specialized nerve cell identity (mature cells belong to distinct populations, like serotonin-producing dentate gyrus cells, etc.). As had been seen in past studies, the mature nerve cells derived from autistic individuals created more complex patterns of branching axons than control cells.
At five different time points during the development of these cells, the researchers separated out the nerve cells or nerve-cells-to-be. Then they obtained all the RNA from the cells, which provides a window into gene activity. Next, the researchers performed a computational analysis to identify groups of genes that were active at specific steps. This identified three distinct groups of genes (which they termed “modules”) that defined distinct stages of the developmental process. You can think of these stages as pre-neuron, neural stem cell, and maturing neuron.
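(For the computationally inclined: the paper’s “module” analysis used gene co-expression methods; the toy sketch below, with entirely invented data and a plain k-means clustering as a stand-in, only illustrates the general idea of grouping genes by the shape of their activity across five sampled time points.)

# Toy illustration: grouping genes into "modules" by the shape of their
# expression time-course, using k-means (the study itself used a
# co-expression network method; this is only a simplified stand-in).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
days = np.array([0, 7, 14, 21, 28])          # five sampling points (hypothetical)

# Simulate 300 genes drawn from three underlying activation profiles.
profiles = np.array([
    [1.0, 0.8, 0.4, 0.2, 0.1],               # early ("pre-neuron") module
    [0.2, 0.9, 1.0, 0.6, 0.3],               # middle ("neural stem cell") module
    [0.1, 0.2, 0.5, 0.9, 1.0],               # late ("maturing neuron") module
])
truth = rng.integers(0, 3, size=300)
expr = profiles[truth] + rng.normal(0, 0.1, size=(300, 5))

# Normalise each gene's trajectory so clustering sees shape, not amplitude.
expr = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

modules = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(expr)
for m in range(3):
    print(f"module {m}: {np.sum(modules == m)} genes, "
          f"mean profile {expr[modules == m].mean(axis=0).round(2)}")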
Accelerated development
When these modules were compared in cells from autistic individuals and controls, there weren’t many differences in the two modules that marked later stages of development. The earliest active module, however, appeared to be active on an accelerated schedule in the cells that came from autistic individuals. In other words, while normal cells might reach a given stage of gene activity at day four, those from autistic patients might reach it at day two. This accelerated pace was also apparent in the physical changes the cells undergo as they mature.
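(The “day four versus day two” comparison boils down to estimating how far one activity curve is shifted in time relative to another. A minimal illustration, not the paper’s method, using made-up sigmoid curves:)

# Toy estimate of developmental acceleration: find the time shift that
# best aligns an "ASD" activity curve with a "control" curve.
import numpy as np

t = np.linspace(0, 10, 101)                   # days (hypothetical grid)
control = 1 / (1 + np.exp(-(t - 4.0)))        # control module switches on ~day 4
asd     = 1 / (1 + np.exp(-(t - 2.0)))        # ASD module switches on ~day 2

shifts = np.linspace(-5, 5, 201)
errors = [np.mean((np.interp(t, t + s, control) - asd) ** 2) for s in shifts]
best = shifts[int(np.argmin(errors))]
print(f"ASD curve leads control by about {-best:.1f} days")   # ~2 days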
The earliest two modules also contain a number of genes that had previously been identified as enhancing the risk of autism. And expression of some of these genes at early stages in the process could mimic the progression of autism, accelerating the developmental process.
The timing of all of this suggested to the authors that the problems in these autistic individuals came from the process of forming neural stem cells. This sets the stage for problems in everything that comes after it.
To test this idea, the authors came up with a clever solution. People have identified a way to bypass the neural stem cell stage of the process and force stem cells to develop directly into neurons. (Surprisingly, all this takes is the expression of a single gene.) If the specification of neural stem cells is where things go wrong, then skipping it entirely might rescue the problems. And, in fact, it does. The complexity of neural branching was similar in the experimental and control cells when neurons were generated using this approach.
We haven’t “solved” autism
It’s important to emphasize that this research doesn’t mean we’ve “solved” autism in any way. The participants in this study were selected as having a single symptom that clearly placed them on the autism spectrum; it’s not clear whether these results will apply to those who are on the spectrum due to other symptoms. And there’s a big difference between knowing something goes wrong during neural stem cell generation and knowing what, exactly, has gone wrong. So there’s still a lot of work to do here.
But the results do indicate that, at least in some individuals with autism, problems start extremely early. In humans, neural stem cells are specified before three weeks into the pregnancy—a point when many people aren’t even aware or certain they’re pregnant. Depending on how general this is, that may mean that interventions at the earliest stages of autism—either by directly addressing the problem or by limiting any environmental influences that promote autism—are pretty unlikely.
While this is an impressive body of work on its own, what’s really striking is how it puts together so many techniques that are relatively recent developments: the use of stem cells to study diseases that are otherwise difficult to address experimentally, the ability to do large-scale RNA sequencing, and the algorithms that let us analyze this data. Biology is filled with incremental developments, and it’s only when you stop to consider what had to happen before research like this was even possible that the rate of progress can be appreciated.

Uncategorized

Aliens! :D (I just piss myself laughing every time the topic is brought up now)

“An advanced extra-terrestrial species? One (or possibly billions) that are capable of intergalactic travel, possibly millions or billions of years more advanced than us… that could very well be right here visiting Earth, and have been since the beginning, possibly being our creators (or creators of DNA)… … I don’t think the human mind can even comprehend it!

To ‘meet’ one or to interact with one would be too unfathomable for the human mind… that’s the reason I DON’T believe in ‘alien conspiracies’ as such. Conspiracies between a group/s of humans and an extra-terrestrial species. I think the Governments, militaries and ‘powers that be’ know they exist, know they are visiting us, most likely have proof of their existence… but there is no ‘conspiracy’ or collusion between them.
(The conspiracy being the shitheads like The Vatican and other religious orders would do EVERYTHING they can to hide it, because it threatens their shitty little fantasy world they’ve created for everyone)

That, and I don’t believe humans are actually the most intelligent species on this planet… dolphins are! 😀
(So why would an intelligent alien species waste its time with us? It would probably ask the dolphins if they want us wiped out… and if the dolphins had any common sense they would reply with a resounding ‘YES PLEASE!’)

I think, through all this talk of radio contact, searching for exoplanets and signs of civilisation… the most practical method, the only way to even try and grasp such possibilities and to possibly make contact, is… … psychedelic compounds such as DMT and Ayahuasca… exploring consciousness through ‘shamanic’ practices.

THE ZOO HYPOTHESIS MAKES ABSOLUTE COMPLETE SENSE TO ME!

Aliens ARE out there! Harvard astronomer predicts FIRST CONTACT in shock claim

HUMANS will be in for a huge shock when – not if – we encounter aliens, according to one of the world’s top scientists.
The chairman of the Harvard Astronomy Department said humanity’s first encounter will be almost unfathomable. Meeting extraterrestrials would be the biggest step in human history, according to Professor Avi Loeb of Harvard University. He believes it is one of the last remaining steps we have to take as a species, and that it is a matter of when, not if.

Avi Loeb on the Mysterious Interstellar Body ‘Oumuamua
‘Thinking About Distant Civilizations Isn’t Speculative’

Astronomer Avi Loeb believes that the interstellar object dubbed ‘Oumuamua could actually be a probe sent by alien beings. Given the evidence that has so far been gathered, he says, it is a possible conclusion to draw.

The Aliens Before Us –“We are Not the First Technological Civilization” (Or, are We?)

We live in a universe where matter is distributed in a hundred billion galaxies, each containing a hundred billion stars, made up of quantum fields where space and time do not exist, that manifest themselves in the form of particles, such as electrons and photons, or as waves. Tucked into the 14-billion-year history of this vast observable universe with 100 trillion planets is a pale blue dot teeming with life and a technological civilization created by a strange species known as Homo sapiens.
Are we an aberration, an evolutionary accident, or are we one of millions of evolving beings scattered throughout the distant reaches of the cosmos?
In June of 2016, The New York Times attempted to answer this great unanswered question of the human species, publishing an op-ed titled, “Yes, There Have Been Aliens.”
In a brilliant display of intuition vs evidence, astrophysicist Adam Frank at the University of Rochester and author of “Light of the Stars: Alien Worlds and the Fate of the Earth”, proposed that “while we do not know if any advanced extraterrestrial civilizations currently exist in our galaxy, extraterrestrial civilizations almost certainly existed at one time or another in the evolution of the cosmos. The degree of pessimism required to doubt the existence, at some point in time, of an advanced extraterrestrial civilization borders on the irrational. We now have enough information to conclude that they almost certainly existed at some point in cosmic history.”
Frank writes that this probability is not an abstraction, not just a pure number. Instead, he says, it represents something very real: “10 billion trillion planets existing in the right place for nature to have at it. Each world is a place where winds may blow over mountains, where mists may rise in valleys, where seas may churn and rivers may flow. (Note our solar system has two worlds in the Goldilocks zone — Earth and Mars — and both have had winds, seas and rivers). When you hold that image in your mind, you see something remarkable: The pessimism line actually represents the 10 billion trillion times the universe has run its experiment with planets and life.”
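(Frank’s “pessimism line” is simple arithmetic, and worth seeing laid out. A back-of-envelope sketch using the article’s “10 billion trillion” planet count; the threshold probability is just its reciprocal:)

# Frank's "pessimism line", as described in the article: the per-planet
# probability of a technological civilization below which we would expect
# to be alone in cosmic history.
n_planets = 10e21            # "10 billion trillion" habitable-zone planets
pessimism_line = 1 / n_planets
print(f"pessimism line: {pessimism_line:.1e} per planet")   # 1.0e-22

# Expected number of civilizations for a few per-planet probabilities:
for p in (1e-24, 1e-22, 1e-20):
    print(f"p = {p:.0e}  ->  expected civilizations ever: {n_planets * p:.0e}")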

Frank’s arguments have their appeal, countered Ross Andersen in The Atlantic, but it is an appeal to intuition: “The simple fact is that no matter how much we wish to live in a universe that teems with life—and many of us wish quite fervently—we haven’t the slightest clue how often it evolves. Indeed, we aren’t even sure how life arose on this planet. We have our just-so stories about lightning strikes and volcanic vents, but no one has come close to duplicating abiogenesis in a lab. Nor do we know whether basic organisms reliably evolve into beings like us.”
Evolutionary biologist Wentao Ma and collaborators, observes Frank, used computer simulations to show that the first replicating molecules could have been short strands of RNA that were easy to form and which quickly led to a “takeover” by DNA. And, as neurobiologist and leading expert on evolution of intelligence, Lori Marino has argued, human intelligence evolved on top of cognitive structures that already had a long history of life on Earth. Thus our kind of intelligence should no longer be seen as entirely separated from what evolved before.

Why the human brain evolved is a question that has puzzled scientists for decades. It was answered in 2010 by Colin Blakemore, an Oxford neurobiologist who argued that a mutation in the brain of a single human being 200,000 years ago turned intellectually able primates into a super-intelligent species that would conquer the world. Homo sapiens appears to be a genetic accident.
We are the only species of the billions of species that have existed on Earth that has shown an aptitude for radios, and even we failed to build one during the first 99% of our 7-million-year history, according to the Australian National University’s Charles Lineweaver.
Genetic studies suggest every living human can be traced back to a single woman called “Mitochondrial Eve” who lived about 200,000 years ago, Blakemore said in an interview with The Guardian. He suggested that “the sudden expansion of the brain 200,000 years ago was a dramatic spontaneous mutation in the brain of Mitochondrial Eve or a relative which then spread through the species. A change in a single gene would have been enough.”
Blakemore stressed the plasticity our brains gained when this mutation occurred. Some scientists, he pointed out, “believe that skills like language have a strong genetic basis, but my theory stresses the opposite, that knowledge, picked up by our now powerful brains, is the crucial mental component. It means that we are uniquely gifted in our ability to learn from experience and to pass this on to future generations.”
The huge and logical downside to Blakemore’s theory is that a single generation starved of knowledge (thanks to some Sixth Mass Extinction global disaster, for example) would be cast back to the Stone Age. “Everything,” Blakemore observes, “would be undone. On the other hand, there is no sign that the human brain has reached its capacity to accumulate knowledge, which means that the wonders we have already created – from spaceships to computers – represent only the start of our achievements.”
“The universe gets to run the experiment many, many times,” writes Frank. “So if you want to argue Earth is unique, then the onus is on you to show why technological intelligence is so strongly selected against.”
We can’t extrapolate from our existence on Earth, counters Andersen, because it’s only one data point. We could be the only intelligent beings in the universe, he writes, “or we could be one among trillions, and either way Earth’s natural history would look the exact same. Even if we could draw some crude inferences, the takeaways might not be so reassuring. It took two billion years for simple, single-celled life to spawn our primordial lineage, the eukaryotes.”
“And so far as we can tell,” he continued, “it only happened once. It took another billion years for eukaryotes to bootstrap into complex animal life, and hundreds of millions of years more for the development of language and sophisticated tool-making. And unlike the eye, or bodies with legs—adaptations that have arisen independently on many branches of life’s tree—intelligence of the spaceship-making sort has only emerged once, in all of Earth’s history. It just doesn’t seem like one of evolution’s go-to solutions.”
In 2012, Princeton astrophysical sciences professor Edwin Turner and lead author David Spiegel, of the Institute for Advanced Study, analyzed what is known about the likelihood of life on other planets in an effort to separate the facts from the mere expectation that life exists outside of Earth. The researchers used a Bayesian analysis — which weighs how much of a scientific conclusion stems from actual data and how much comes from the prior assumptions of the scientist — to determine the probability of extraterrestrial life once the influence of these presumptions is minimized.
Their study argued that the idea that life has or could arise in an Earth-like environment has only a small amount of supporting evidence, most of it extrapolated from what is known about abiogenesis, or the emergence of life, on early Earth. Instead, their analysis showed that the expectations of life cropping up on exoplanets — those found outside Earth’s solar system — are largely based on the assumption that it would or will happen under the same conditions that allowed life to flourish on this planet.
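(The gist of that Bayesian argument can be reproduced in miniature. In the toy sketch below, abiogenesis is a Poisson process with an unknown rate, the single datum is that life arose early on Earth, and an anthropic correction conditions on life having arisen at all in time for observers to evolve. All numbers are illustrative, not Spiegel and Turner’s; the takeaway, that the answer tracks the prior rather than the data, is theirs:)

# Toy Bayesian analysis in the spirit of Spiegel & Turner (2012):
# how much does the posterior for the abiogenesis rate depend on the prior?
import numpy as np

log_lam = np.linspace(-6, 3, 500)            # log10 of rate (events per Gyr)
lam = 10.0 ** log_lam
t_early, t_max = 0.4, 1.4                    # Gyr; illustrative values

# Likelihood of our one datum (life arose by t_early), conditioned on the
# anthropic requirement that life arose at all by t_max:
like = (1 - np.exp(-lam * t_early)) / (1 - np.exp(-lam * t_max))

# The grid is uniform in log10(rate), so a prior flat in the rate itself
# appears here as a weight proportional to lam.
for name, prior in [("log-uniform prior", np.ones_like(lam)),
                    ("uniform-in-rate prior", lam)]:
    post = like * prior
    post /= np.trapz(post, log_lam)
    p_rare = np.trapz(post[lam < 0.01], log_lam[lam < 0.01])
    print(f"{name:22s}: P(rate < 0.01/Gyr) = {p_rare:.2f}")

# The log-uniform prior leaves substantial posterior mass on "life is rare",
# while the uniform-in-rate prior all but rules it out: the single early
# data point barely discriminates, so the prior dominates the conclusion.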

In fact, the researchers concluded, the current knowledge about life on other planets suggests that it’s very possible that Earth is a cosmic aberration where life took shape unusually fast. If so, then the chances of the average terrestrial planet hosting life would be low.
“Fossil evidence suggests that life began very early in Earth’s history and that has led people to determine that life might be quite common in the universe because it happened so quickly here, but the knowledge about life on Earth simply doesn’t reveal much about the actual probability of life on other planets,” Turner said.
In conclusion, it appears that the choice between intuition or evidence is yours to make.

“Great Known Unknown” –The Number of Galaxies Beyond the Observable Universe

According to current measurements, the size of the cosmos must be larger than a hundred billion light-years. This is the order of magnitude of the universe we have indirect access to, writes physicist Carlo Rovelli. “It is around 10^60 times greater than the Planck length, a number of times that is given by a 1 followed by sixty zeroes. Between the Planck scale and the cosmological one, then, there is the mind-blowing separation of sixty orders of magnitude.”
In this space, adds Rovelli in Reality Is Not What It Seems: The Journey to Quantum Gravity—“between the size of the minute quanta of space, up to quarks, protons, atoms, chemical structures, mountains, stars, galaxies (each formed by one hundred billion stars), clusters of galaxies, and right up until the seemingly boundless visible universe of more than a hundred billion galaxies—unfolds the swarming complexity of our universe, a universe we know in only a few aspects. Immense. Huge. Extraordinarily huge. But finite.”
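(Rovelli’s figure is easy to sanity-check with round numbers; depending on whether you take the radius or the diameter of the visible universe, you land within an order of magnitude or so of his “around 10^60”:)

# Back-of-envelope check of Rovelli's "sixty orders of magnitude".
import math

planck_length = 1.6e-35          # metres
universe_scale = 1e26            # metres, order of magnitude of the visible universe

ratio = universe_scale / planck_length
print(f"ratio ~ 10^{math.log10(ratio):.0f}")   # roughly sixty orders of magnitude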
Astronomers are confident that the volume of space-time within range of our telescopes—‘the universe’—is only a tiny fraction of the aftermath of the big bang. “We’d expect far more galaxies located beyond the horizon, unobservable,” says the renowned astrophysicist, Martin Rees, “each of which (along with any civilizations it hosts) will evolve rather like our own.”
One of the most fundamental known unknowns in astronomy is just how many galaxies the universe contains. The Hubble Deep Field images, captured in the mid-1990s, revealed untold numbers of faint galaxies. It was estimated that the observable Universe contains between 100 and 200 billion galaxies.
“It boggles the mind that over 90% of the galaxies in the Universe have yet to be studied. Who knows what we will find when we observe these galaxies with the next generation of telescopes,” says astronomer and Google Scholar Christopher Conselice at the University of Nottingham, who led the team that discovered that there are ten times more galaxies in the universe than previously thought, and an even wider space to ultimately search for extraterrestrial life.
In 2016, astronomers using data from the NASA/ESA Hubble Space Telescope and other telescopes performed an accurate census of the number of galaxies, and came to the surprising conclusion that there are at least 10 times as many galaxies in the observable universe as previously thought. The image itself was produced by the Frontier Fields Collaboration (a joint effort between NASA’s Hubble, Spitzer, and Chandra space telescopes), allowing scientists to detect galaxies that are as much as 100 times fainter than those independently captured before.
The international team, reports Nature, led by Conselice from the University of Nottingham, UK, have shown that this figure is at least ten times too low. Conselice and his team reached this conclusion using deep space images from Hubble, data from his team’s previous work, and other published data. They painstakingly converted the images into 3D, in order to make accurate measurements of the number of galaxies at different times in the Universe’s history.
In addition, they used new mathematical models which allowed them to infer the existence of galaxies which the current generation of telescopes cannot observe. This led to the surprising realization that in order for the numbers to add up, some 90% of the galaxies in the observable Universe are actually too faint and too far away to be seen — yet.
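(The “mathematical models” here are galaxy luminosity and mass functions extrapolated below the detection limit. A toy version of that logic, with a made-up Schechter function and a hypothetical survey threshold rather than the team’s fitted values, shows how the faint end can hide most galaxies:)

# Toy version of inferring unseen galaxies: integrate a Schechter
# luminosity function above and below a detection limit.
# (Parameters are illustrative, not the values fitted by Conselice et al.)
import numpy as np
from scipy.integrate import quad

phi_star, alpha = 1.0, -1.3          # normalisation and faint-end slope

def schechter(L):                    # number density per unit L, L in units of L*
    return phi_star * L**alpha * np.exp(-L)

L_min   = 1e-4                       # faint-end cutoff (needed: alpha < -1 diverges)
L_limit = 0.05                       # detection limit of a hypothetical survey

seen, _   = quad(schechter, L_limit, np.inf)
unseen, _ = quad(schechter, L_min, L_limit)
print(f"fraction of galaxies too faint to detect: {unseen / (seen + unseen):.0%}")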

All that the human species will be able to view after a hundred billion years will be the dead and dying stars of our Local Group. But these, says Martin Rees, who was not part of Conselice’s team, in On the Future, “could continue for trillions of years—time enough, perhaps, for the long-term trend for living systems to gain complexity and ‘negative entropy’ to reach a culmination. All the atoms that were once in stars and gas could be transformed into structures as intricate as a living organism or a silicon chip—but on a cosmic scale. Against the darkening background, protons may decay, dark matter particles annihilate, occasional flashes when black holes evaporate—and then silence.”
We can only see a finite number of galaxies because there’s a horizon, a shell around us, delineating the greatest distance from which light can reach us. But that shell, observes Rees, “has no more physical significance than the circle that delineates your horizon if you’re in the middle of the ocean.”

In analyzing the data the team looked more than 13 billion years into the past. This showed them that galaxies are not evenly distributed throughout the Universe’s history. In fact, it appears that there were a factor of 10 more galaxies per unit volume when the Universe was only a few billion years old compared with today. Most of these galaxies were relatively small and faint, with masses similar to those of the satellite galaxies surrounding the Milky Way.
These results are powerful evidence that a significant evolution has taken place throughout the Universe’s history, an evolution during which galaxies merged together, dramatically reducing their total number. “This gives us a verification of the so-called top-down formation of structure in the Universe,” explains Conselice.
The decreasing number of galaxies as time progresses also contributes to the solution of Olbers’ Paradox — why the sky is dark at night. The astronomer Heinrich Olbers argued that the night sky should be permanently flooded by light, because in an unchanging Universe filled with an infinite number of stars, every single part of the sky should be occupied by a bright object. However, our modern understanding of the Universe is that it is both finite and dynamic — not infinite and static.
The team came to the conclusion that there is such an abundance of galaxies that, in principle, every point in the sky contains part of a galaxy. However, most of these galaxies are invisible to the human eye and even to modern telescopes, owing to a combination of factors: redshifting of light, the Universe’s dynamic nature and the absorption of light by intergalactic dust and gas, all combine to ensure that the night sky remains mostly dark.
The image at the top of the page shows a Hubble image of Abell 2744. The light in the image comes from dead, ghost galaxies torn apart long ago by the cluster’s gravitational forces; their stars were scattered into what is known as intracluster space — the space between the galaxies.

Could Extraterrestrial Sugar Explain How Life Began on Earth?

Extraterrestrial Sugar
Scientists have discovered derivatives of life’s building blocks in carbon-rich meteorite samples, a first. They also showed how biological compounds can form in interstellar space. These new findings support the theory that life on Earth originated with help from cosmic impacts.
Sugars and sugar derivatives are essential to life on Earth. But they, along with amino acids and other organic molecules, can be found in space as well, on asteroids and comets. Scientists have suggested that objects in space may have fallen to Earth and delivered the compounds that would spark biological processes on our planet.

Sugar and Ice
In this new study, scientists analyzed five residues from ice mixtures exposed to ultraviolet radiation in conditions simulating the interstellar medium in space. The goal was to see whether organic molecules found in life on Earth would form in a simulated space environment. In these residues, they found 2-deoxyribose, or the sugar component that makes up the “D” in DNA. They also found derivatives of 2-deoxyribose, similar compounds that have one atom or a group of atoms that are different.
“Astrochemistry ice photolysis experiments, such as those described in our paper, provide a convincing explanation on how those compounds may form in such astrophysical environments,” lead researcher Michel Nuevo of NASA Ames Research Center said about these experiments in an email.
There are many theories surrounding the origins of life on Earth. Scientists think that biological compounds like 2-deoxyribose may have played a role in the formation of Earth’s first organisms. Some have even suggested that these biological compounds formed in the abiotic environment of space, aboard objects like comets, asteroids, meteoroids, and interplanetary dust particles. Previous studies have shown how biological compounds might form in space, and they could have fallen to Earth early on in its history when bombardment by asteroids and comets was more common.
In short, this study demonstrated that biological compounds like 2-deoxyribose can form in a non-biological environment.
“Our paper, together with several other papers describing similar astrochemistry experiments published in the last 25 years or so, shows that a very wide variety of compounds of biological interest can be formed under abiotic (i.e., non-biological) conditions in astrophysical environments,” said lead researcher Michel Nuevo of NASA Ames Research Center in an email.
Still Searching for DNA
In addition to this analysis, the researchers were able to identify some of these deoxy sugar derivatives in carbonaceous, or carbon-rich, meteorite samples for the first time ever. This proved that these biological compounds can be produced in a space environment. However, while the team found 2-deoxyribose in the laboratory experiments, they were unable to find the DNA component in the meteorite samples analyzed.
According to Nuevo, while this work doesn’t solve the mystery of how life on Earth originated, it shows how it is quite likely that meteorites have deposited biological compounds on Earth throughout history.
“Since asteroids and comets routinely crash onto the surface of planets, including the Earth, in the form of meteorites, it is obvious that large amounts of organic compounds, including compounds of biological interest, are routinely dumped on our continents and in our oceans, the same way they are probably dumped onto other planets of the solar system,” he said. “This does not explain how life originated on our planet more than 4 billion years ago, as nobody knows how those organic compounds could combine into the even more complex structures required for life to get started. But it shows that sugar derivatives and other compounds of biological interest are present and are probably dumped onto planets everywhere in the galaxy.”
This work is published in the journal Nature Communications.

Uncategorized

‘Israel needs national vision for AI or risks falling behind, tech authority says’… YO BOYS! :D

Israel needs national vision for AI or risks falling behind, tech authority says

Israel Innovation Authority urges government, academia, industry to join forces for advances in artificial intelligence as global race is underway… blah blah blah

“‘Artificial Intelligence’ is a term everyone throws around without understanding what it really means… true Artificial Intelligence is far from being developed (500 years?), what most people coin as ‘artificial intelligence’ is just advanced algorithms… it is not truly a ‘learning’ machine, a self-aware and sentient being, and still nowhere near what we perceive as ‘consciousness’
(and Quantum computers will NEVER become a reality)

… But I know of a guy!” 😀
(much of this has disappeared from the public domain for some strange reason)

Top Differences Between Artificial Intelligence, Machine Learning & Deep Learning

😀


Method and system for pattern recognition and processing
Dec 23, 1998
The present invention provides a method and system for pattern recognition and processing. Information representative of physical characteristics or representations of physical characteristics is transformed into a Fourier series in Fourier space within an input context of the physical characteristics that is encoded in time as delays corresponding to modulation of the Fourier series at corresponding frequencies. Associations are formed between Fourier series by filtering the Fourier series and by using a spectral similarity between the filtered Fourier series to determine the association based on Poissonian probability. The associated Fourier series are added to form strings of Fourier series. Each string is ordered by filtering it with multiple selected filters to form multiple time order formatted subset Fourier series, and by establishing the order through associations with one or more initially ordered strings to form an ordered string. Associations are formed between the ordered strings to form complex ordered strings that relate similar items of interest. The components of the invention are active based on probability using weighting factors based on activation rates.
Description
This application claims the benefit of U.S. provisional application Ser. No. 60/068,834, filed Dec. 24, 1997.
BACKGROUND OF THE INVENTION
Attempts have been made to create pattern recognition systems using programming and hardware. The state of the art includes neural nets. Neural nets typically comprise three layers—an input layer, a hidden layer, and an output layer. The hidden layer comprises a series of nodes which serve to perform a weighted sum of the input to form the output. Output for a given input is compared to the desired output, and a back projection of the errors is carried out on the hidden layer by changing the weighting factors at each node, and the process is reiterated until a tolerable result is obtained. The strategy of neural nets is analogous to sum-of-least-squares algorithms. These algorithms are adaptive enough to provide reasonable output in response to variations in input, but they cannot create totally unanticipated useful output or discover associations between multiple inputs and outputs. Their usefulness to create novel conceptual content is limited; thus, advances in pattern recognition systems using neural nets are limited.
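(For reference, the kind of three-layer network with error back-projection that this background section describes can be written in a few lines. The toy below learns XOR and is unrelated to the patented system:)

# Minimal input -> hidden -> output network with back-propagation,
# of the kind the patent's background section describes. Toy XOR task.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)     # input -> hidden weights
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)     # hidden -> output weights
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for step in range(5000):
    h = sigmoid(X @ W1 + b1)                        # hidden layer: weighted sums
    out = sigmoid(h @ W2 + b2)
    err = out - y                                   # compare to desired output
    # Back-project the errors onto the weights (gradient descent).
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())                         # should approach [0, 1, 1, 0]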
SUMMARY OF THE INVENTION
The present invention is directed to a method and system for pattern recognition and processing involving processing information in Fourier space.
The system of the present invention includes an Input Layer for receiving data representative of physical characteristics or representations of physical characteristics capable of transforming the data into a Fourier series in Fourier space. The data is received within an input context representative of the physical characteristics that is encoded in time as delays corresponding to modulation of the Fourier series at corresponding frequencies. The system includes a memory that maintains a set of initial ordered Fourier series. The system also includes an Association Layer that receives a plurality of the Fourier series in Fourier space including at least one ordered Fourier series from the memory and forms a string comprising a sum of the Fourier series and stores the string in memory. The system also includes a String Ordering Layer that receives the string from memory and orders the Fourier series contained in the string to form an ordered string and stores the ordered string in memory. The system also includes a Predominant Configuration Layer that receives multiple ordered strings from the memory, forms complex ordered strings comprising associations between the ordered strings, and stores the complex ordered strings to the memory. The components of the system are active based on probability using weighting factors based on activation rates.
Another aspect of the present invention is directed to ordering a string representing the information. This aspect of the invention utilizes a High Level Memory section of the memory that maintains an initial set of ordered Fourier series. This aspect of the invention includes obtaining a string from the memory and selecting at least two filters from a selected set of filters stored in the memory. This aspect also includes sampling the string with the filters such that each of the filters produce a sampled Fourier series. Each Fourier series comprises a subset of the string. This aspect also includes modulating each of the sampled Fourier series in Fourier space with the corresponding selected filter such that each of the filters produce an order formatted Fourier series. Furthermore, this aspect includes adding the order formatted Fourier series produced by each filter to form a summed Fourier series in Fourier space, obtaining an ordered Fourier series from the memory, determining a spectral similarity between the summed Fourier series and the ordered Fourier series, determining a probability expectation value based on the spectral similarity, and generating a probability operand having a value selected from a set of zero and one, based on the probability expectation value. These steps are repeated until the probability operand has a value of one. Once the probability operand has a value of one, this aspect includes storing the summed Fourier series to an intermediate memory section. Thereafter, this aspect includes removing the selected filters from the selected set of filters to form an updated set of filters, removing the subsets from the string to obtain an updated string, and selecting an updated filter from the updated set of filters. This aspect further includes sampling the updated string with the updated filter to produce a sampled Fourier series comprising a subset of the string, modulating the sampled Fourier series in Fourier space with the corresponding selected updated filter to produce an updated order formatted Fourier series, recalling the summed Fourier series from the intermediate memory section, adding the updated order formatted Fourier series to the summed Fourier series to form an updated summed Fourier series in Fourier space, and obtaining an updated ordered Fourier series from the memory. This aspect further includes determining a spectral similarity between the updated summed Fourier series and the updated ordered Fourier series, determining a probability expectation value based on the spectral similarity, and generating a probability operand having a value selected from a set of zero and one, based on the probability expectation value. These steps are repeated until the probability operand has a value of one or all of the updated filters have been selected from the updated set of filters. If all of the updated filters have been selected before the probability operand has a value of one, then clearing the intermediate memory section and repeating the steps starting with selecting at least two filters from a selected set of filters. Once the probability operand has a value of one, the updated summed Fourier series is stored to the intermediate memory section and steps beginning with removing the selected filters from the selected set of filters to form an updated set of filters are repeated until one of the following set of conditions is satisfied: the updated set of filters is empty or the remaining subsets of the string is nil. 
If the remaining subsets of the string is nil, then the Fourier series in the intermediate memory section is stored in the High Level Memory section of the memory.
Another aspect of the present invention is directed to forming complex ordered strings by forming associations between a plurality of ordered strings. This aspect of the invention includes recording ordered strings to the High Level Memory section, forming associations of the ordered strings to form complex ordered strings, and recording the complex ordered strings to the High Level Memory section. A further aspect of the invention is directed to forming a predominant configuration based on probability. This aspect of the invention includes generating an activation probability parameter, storing the activation probability parameter in the memory, generating an activation probability operand having a value selected from a set of zero and one, based on the activation probability parameter, activating any one or more components of the present invention such as matrices representing functions, data parameters, Fourier components, Fourier series, strings, ordered strings, components of the Input Layer, components of the Association Layer, components of the String Ordering Layer, and components of the Predominant Configuration Layer, the activation of each component being based on the corresponding activation probability parameter, and weighting each activation probability parameter based on an activation rate of each component.
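(The full patented pipeline is hard to reconstruct from the text alone, but two of its recurring primitives, spectral similarity between Fourier series and a 0/1 “probability operand” generated from a probability expectation value, can be sketched loosely. This is one illustrative reading, not the patented implementation:)

# Loose toy of two primitives from the patent text: (1) spectral similarity
# between two signals' Fourier series, (2) a 0/1 "probability operand"
# generated from a probability expectation value.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)

a = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
b = np.sin(2 * np.pi * 5 * t) + 0.2 * rng.normal(size=t.size)   # similar signal
c = rng.normal(size=t.size)                                     # unrelated signal

def spectral_similarity(x, y):
    """Cosine similarity between Fourier magnitude spectra."""
    X, Y = np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(y))
    return float(X @ Y / (np.linalg.norm(X) * np.linalg.norm(Y)))

def probability_operand(similarity):
    """Draw 0 or 1 with expectation tied to the similarity (toy rule)."""
    return int(rng.random() < similarity)

for name, sig in [("b (similar)", b), ("c (unrelated)", c)]:
    s = spectral_similarity(a, sig)
    print(f"{name}: similarity {s:.2f}, operand {probability_operand(s)}")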
Novel method and system for pattern recognition and processing using data encoded as Fourier series in Fourier space
Article in Engineering Applications of Artificial Intelligence 19(2):219-234 · March 2006

Abstract
A method and system for pattern recognition and processing is reported that has a data structure and theoretical basis that are unique. This novel approach anticipates the signal processing action of an ensemble of neurons as a unit and intends to simulate aspects of brain that give rise to capabilities such as intelligence, pattern recognition, and reasoning that have not been reproduced with past approaches such as neural networks that are based individual simulated ”neuronal units.” Information representative of physical characteristics or representations of physical characteristics is transformed into a Fourier series in Fourier space within an input context of the physical characteristics that is encoded in time as delays corresponding to modulation of the Fourier series at corresponding frequencies. Associations are formed between Fourier series by filtering the Fourier series and by using a spectral similarity between the filtered Fourier series to determine the association based on Poissonian probability. The associated Fourier series are added to form strings of Fourier series. Each string is ordered by filtering it with multiple selected filters to form multiple time order formatted subset Fourier series, and by establishing the order through associations with one or more initially ordered strings to form an ordered string. Associations are formed between the ordered strings to form complex ordered strings that relate similar items of interest. The components of the system based on the algorithm are active based on probability using weighting factors based on activation rates