Time is fundamental; it is much of what ‘being’ is about. It is central to reality, to our lived experience, to our hopes and dreams. But as central as it is, it is still an enigma.
Time is a knotty problem for physics, metaphysics, philosophy, and religion: something fundamental to our existence and experience that, after thousands of years and billions of person-hours of contemplation and analysis, still escapes understanding. Like others down the centuries, I find that the more I think about it, the harder it becomes to grasp.
Pragmatically there is only the local now, a few moments of the past, and a glance into the future. Practically there are the Past and the Future; now is just the transition from one to the other.
What is time? It seems to be about change, and time’s arrow is provided by entropy, the slow winding down of the universe.
Existence, the now, is only the Planck-time instant. What stitches the universe together are memory (enabled by change) and imagination (enabled by memory).
One interpretation of quantum physics says that it is the conscious mind that ‘collapses’ the probability function into one reality. In that view it is our mind-memory that provides the crashing rock against which universal potentiality breaks into reality. Is it us, stitching together the universe?
Why do we talk about timespace? Because time has no meaning without space, and space no meaning without time. Imagine an infinite cube of arbitrary complexity. Without time, nothing about it has any meaning: you cannot travel from one point to another; there is no energy, because there is no movement; nothing can move, because movement is a change of location, and that has no meaning without time. Equally, without space, time has no meaning, because there is nothing to change. One could say something can endure or wind down, but without space for that to occur in, it has no meaning.
So we ‘live’ in a timespace that we instantiate and make objective. It is still real, in that the physics of it is (probably) fixed, but is it possible that it is our consciousness (or other consciousnesses) that takes possibility and hardens it into reality, inflating the universe around us out to the limits of our questing minds?
The first known interstellar object to visit our solar system, 1I/2017 U1 ‘Oumuamua, was discovered Oct. 19, 2017 by the University of Hawaii’s Pan-STARRS1 telescope, funded by NASA’s Near-Earth Object Observations (NEOO) Program, which finds and tracks asteroids and comets in Earth’s neighborhood. While originally classified as a comet, observations revealed no signs of cometary activity after it slingshotted past the Sun on Sept. 9, 2017 at a blistering speed of 196,000 miles per hour (87.3 kilometers per second). It was briefly classified as an asteroid until new measurements found it was accelerating slightly, a sign it behaves more like a comet.
The second image is meant to make you think. Even with one of our very powerful telescopes, that faint dot circled in the center is all we ever saw of ‘Oumuamua. With our computational tools we could detect that it was accelerating and get an idea of its surface composition, but the data we collected was negligible (though also amazing, given the distance and velocity of this objectively tiny object).
Extraterrestrial: On ‘Oumuamua as Artifact
by PAUL GILSTER on FEBRUARY 23, 2021
The reaction to Avi Loeb’s new book Extraterrestrial (Houghton Mifflin Harcourt, 2021) has been quick in coming and dual in nature. I’m seeing a certain animus being directed at the author in social media venues frequented by scientists, not so much for suggesting the possibility that ‘Oumuamua is an extraterrestrial technological artifact, but for triggering a wave of misleading articles in the press. The latter, that second half of the dual reaction, has certainly been widespread and, I have to agree with the critics, often uninformed.
The Centauri Dreams article, as always excellent, discusses the reaction to the book in a way very much in line with the arguments of the book itself.
The author of the book, a Harvard astronomer of high repute, argues that the data actually points to ‘Oumuamua being an artifact, and that since that theory best fits the data, it is/was an extraterrestrial visitor. He then goes on to review other theories and the way the science community came together to present a ‘consensus’ that was more about PR, and about making life easier for the average person in the broad community of sky explorers, than about doing the hard work of explaining multiple theories and sets of data that left the question very open, with a starkly amazing option still in play.
Essentially this is about science and the science community, but also about journalism in its debauched epoch. Many of us grew up with science pushed as a noble adventure, maybe the last noble one, with heroes and a few villains: heroes of the mind, of letters, and of video who didn’t get shot at or mugged or even have to live rough. Carl Sagan, David Attenborough, and many other names come to mind.
The problem is that these men and women were scientists, academics with deep knowledge (if often deeply attached to one trope) and great communicators. Far too many of those who followed were/are attached to a trope because of its alignment with their desired outcome, without the background or willingness to understand that even the most beautiful theory may be utterly wrong and always HAS to stand up to any counter-evidence presented.
Also, the scientific community, once quite small, is now huge, with all the pressures of a large bureaucratic endeavor: go along to get along, careerism, groupthink, cliques, etc. And especially in ‘charismatic’ endeavors like space, the pressure is to be ‘in the consensus’ and ‘never be caught wrong-footed in the limelight.’
Axiom is not as famous as SpaceX or Blue Origin, or even Boeing or Northrop Grumman, but it is setting up to be a big noise in commercial space. “Axiom Space, Inc., which is developing the world’s first commercial space station, has raised $130M in Series B funding”
In January 2020, NASA selected Axiom to begin attaching its own space station modules to the International Space Station (ISS) as early as 2024, marking the company as a primary driver of NASA’s broad strategy to commercialize LEO. While in its assembly phase, Axiom Station will increase the current usable and habitable volume on ISS and provide expanded research opportunities. By late 2028, Axiom Station will be ready to detach when the ISS is decommissioned and operate independently as its privately owned successor.
From the above ParabolicArc article.
But they are already in the ride-share business, setting up launches of multiple smaller missions on one booster: Axiom buys the ride, then works with the launch customers to integrate their satellites on the mission bus. Another recent milestone:
Seems like there must have been a mash-up of astrophysics, cosmology, and cybernetics a couple of weeks ago: there has been a series of articles about computers and the universe. One series points out that one could conceive of using AGB stars in their ‘dusting mode’ (above) as a computing engine.
But on the other side there have been a couple of articles that touch on the metaphysical (philosophical basis of reality) concept that we, and our universe, are one vast simulation.
…Oxford philosopher Nick Bostrom’s philosophical thought experiment that the universe is a computer simulation. If that were true, then fundamental physical laws should reveal that the universe consists of individual chunks of space-time, like pixels in a video game. “If we live in a simulation, our world has to be discrete,”….
….a discrete field theory, which views the universe as composed of individual bits and differs from the theories that people normally create. While scientists typically devise overarching concepts of how the physical world behaves, computers just assemble a collection of data points…..
…A novel computer algorithm, or set of rules, that accurately predicts the orbits of planets in the solar system….
… devised by a scientist at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL), applies machine learning, the form of artificial intelligence (AI) that learns from experience, to develop the predictions.
Qin (pronounced Chin) created a computer program into which he fed data from past observations of the orbits of Mercury, Venus, Earth, Mars, Jupiter, and the dwarf planet Ceres. This program, along with an additional program known as a ‘serving algorithm,’ then made accurate predictions of the orbits of other planets in the solar system without using Newton’s laws of motion and gravitation. “Essentially, I bypassed all the fundamental ingredients of physics. I go directly from data to data,” Qin said. “There is no law of physics in the middle…
…”Usually in physics, you make observations, create a theory based on those observations, and then use that theory to predict new observations,” said PPPL physicist Hong Qin, author of a paper detailing the concept in Scientific Reports. “What I’m doing is replacing this process with a type of black box that can produce accurate predictions without using a traditional theory or law.”…
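To make the “data to data” idea concrete, here is a toy sketch of my own (purely illustrative; Qin’s actual method is a far more sophisticated machine-learning algorithm, and none of the names or numbers below come from his paper). We “observe” points on a circular orbit, fit a one-step update map from consecutive observations with plain least squares, and then roll that learned map forward to predict the orbit far beyond the training data, never writing down a law of gravity:

```python
import numpy as np

# "Observations": positions on a uniform circular orbit, sampled every dt.
# No gravitational law is given to the model, only the data itself.
dt = 0.1
t = np.arange(0, 20, dt)
orbit = np.stack([np.cos(t), np.sin(t)], axis=1)

# Training pairs: state at time t -> state at time t + dt.
X, Y = orbit[:-1], orbit[1:]

# Fit a linear one-step map M by least squares, so that Y ≈ X @ M.
# For uniform circular motion the true map is a rotation matrix,
# so a linear model can capture it exactly.
M, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Roll the learned map forward 500 steps past the end of the data.
state = orbit[-1]
for _ in range(500):
    state = state @ M

# The predicted point still lies on the unit circle: the "law"
# (uniform circular motion) was absorbed by the black box.
print(np.linalg.norm(state))
```

The point of the toy is exactly Qin’s: the fitted matrix `M` predicts future observations accurately, yet “there is no law of physics in the middle,” only a map learned from data.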
Ok, so now I am going to go a bit sideways, and you may want to just go on about your internet day. But while I laud Qin and his team, I have a bit of an issue with his claims regarding the philosophical basis: not the claim that discrete field theory sparked his exploration of the concept, but the idea that the actual system he developed has anything to say about that metaphysical theory.
Taking nothing away from the team, what I see seems like a straightforward application of machine learning, in fact a relatively simple one, though I would laud the whole idea of applying it to physics in general. A very interesting thought that, like many interesting insights, is oddly obvious in retrospect. (Sorry for the repeated ‘though’ clauses… I absolutely see this as a fascinating insight, and possibly an extremely important one; it just seems like a ‘d’oh’ in retrospect.)
As physics is very much aligned with mathematics (I think because the discovery of each fed back on the other), and mathematics and cybernetics are also deeply intertwined, it should come as no surprise that computer systems designed to create black-box solutions, when fed the right kind of data, will create a black-box model of physical phenomena.
The output of science is tools that allow us to predict finite things about the universe we live in, repeatably and accurately. These tools are often used by engineers to enable technologies that make life better for everyone.
But in many ways this is an engineer’s (relatively narrow) viewpoint. To a large degree an engineer does not care why a tool works, only that it does, and how accurately. Counter to that, a strength of the theory-plus-mathematical-model approach is that it gives you a tool that links the rest of reality to the ‘discrete’ piece you are working on right now: a jumping-off point, or a linking point to other theories, that lets us move on to other problems.
And/But (you knew it was coming) I wonder if this has anything to do with discrete field theory per se. Maybe if the learning algorithm used had that built in, this would show something of that nature; otherwise I do not see it as showing anything in particular, other than the ability of learning systems (which are in some sense continuous, not discrete, systems) to develop predictive models directly from the data (as Qin says), rather than through the labor-intensive methods of theory extraction and proof that have been the basis of scientific exploration since it first evolved in the Middle Ages.
Again, BUT: it has been getting harder to develop these ‘deep’ theories. Look at the colliders and other tools that physicists use these days to probe the depths of our reality. In this world there are many things, like Qin’s next test with nuclear fusion, where an engineering model might be much more valuable than a ‘theory of this,’ if it can be captured and used in a fraction of the time.
It’s all good, fascinating, wonderful…but let’s not get ahead of ourselves.