All Photos Tagged combinatorial
Mikhail Nekhemyevich Tal (9 November 1936 – 28 June 1992) was a Soviet-Latvian chess player and the eighth World Chess Champion. He is considered a creative genius and is widely regarded as one of the most influential players in chess history. Tal played in an attacking and daring combinatorial style. His play was known above all for improvisation and unpredictability. Vladislav Zubok said of him, "Every game for him was as inimitable and invaluable as a poem".
His nickname was "Misha", a diminutive for Mikhail, and he earned the nickname "The Magician from Riga". Both The Mammoth Book of the World's Greatest Chess Games and Modern Chess Brilliancies include more games by Tal than by any other player. He also held the record for the longest unbeaten streak in competitive chess history, 95 games (46 wins, 49 draws) between 23 October 1973 and 16 October 1974, until it was surpassed by Ding Liren's streak of 100 games (29 wins, 71 draws) between 9 August 2017 and 11 November 2018. In addition, Tal was a highly regarded chess writer.
Tal died on 28 June 1992 in Moscow, Russia.
High-resolution images emailed to you are for sale at $25 per image. If I print and send them, the charge is $40. If I send them framed (16x20), it is $200.
You can pay to my PayPal account.
a simple capture of a complex optical sculpture made of mirrors and thin neon lights...
A hypercube of dimension n has 2n "sides" (a 1-dimensional line has 2 end points; a 2-dimensional square has 4 sides or edges; a 3-dimensional cube has 6 2-dimensional faces; a 4-dimensional tesseract has 8 cells). The number of vertices (points) of a hypercube is 2^n (a cube has 2^3 = 8 vertices, for instance).
A simple formula for the number of (n−2)-dimensional faces of an n-dimensional hypercube is 2n^2 − 2n.
The number of m-dimensional hypercubes (just referred to as m-cubes from here on) on the boundary of an n-cube is E(m, n) = 2^(n−m) · C(n, m), where C(n, m) = n! / (m! (n−m)!) and n! denotes the factorial of n.
For example, the boundary of a 4-cube (n=4) contains 8 cubes (3-cubes), 24 squares (2-cubes), 32 lines (1-cubes) and 16 vertices (0-cubes).
This identity can be proved by a combinatorial argument: each of the 2^n vertices, together with a choice of m of the n coordinate directions ("sides"), determines an m-dimensional boundary face, and there are C(n, m) ways of choosing those directions. But each face is counted 2^m times this way, once for each of its vertices, so we must divide by that number. Hence the identity above.
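A quick way to check the identity is to compute it directly; here is a minimal sketch in Python (the function name faces is ours, for illustration):

    # Number of m-dimensional faces of an n-dimensional hypercube:
    # E(m, n) = 2^(n-m) * C(n, m), as derived above.
    from math import comb

    def faces(n: int, m: int) -> int:
        return 2 ** (n - m) * comb(n, m)

    # The 4-cube example above: 8 cubes, 24 squares, 32 lines, 16 vertices.
    assert [faces(4, m) for m in (3, 2, 1, 0)] == [8, 24, 32, 16]
    # The (n-2)-face shortcut 2n^2 - 2n, checked for small n.
    assert all(faces(n, n - 2) == 2 * n * n - 2 * n for n in range(2, 10))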
Sudoku (数独) (English pronunciation: /suːˈdoʊkuː/ soo-DOH-koo) is a logic-based, combinatorial number-placement puzzle. The objective is to fill a 9×9 grid with digits so that each column, each row, and each of the nine 3×3 sub-grids that compose the grid (also called "boxes", "blocks", "regions", or "sub-squares") contains all of the digits from 1 to 9. The puzzle setter provides a partially completed grid, which typically has a unique solution. Completed puzzles are always a type of Latin square with an additional constraint on the contents of individual regions.
The puzzle was popularized in 1986 by the Japanese puzzle company Nikoli, under the name Sudoku, meaning single number. It became an international hit in 2005.
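As a minimal illustration of the row/column/box constraint described above, here is a sketch in Python that checks whether a completed grid is a valid solution (a checker only, not a solver; the function name is ours):

    def is_valid_solution(grid):
        # grid: a 9x9 list of lists holding the digits 1-9.
        digits = set(range(1, 10))
        rows = [set(row) for row in grid]
        cols = [{grid[r][c] for r in range(9)} for c in range(9)]
        boxes = [{grid[r][c] for r in range(br, br + 3) for c in range(bc, bc + 3)}
                 for br in (0, 3, 6) for bc in (0, 3, 6)]
        # Valid iff every row, column, and 3x3 box is exactly {1, ..., 9}.
        return all(group == digits for group in rows + cols + boxes)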
Today's Bible Verse:
All Scripture is God-breathed and is useful for teaching, correcting and training in righteousness.
2 TIMOTHY 3:16
Chi è ciascuno di noi se non una combinatoria d'esperienze, d'informazioni, di letture, d'immaginazioni? Ogni vita è un'enciclopedia, una biblioteca, un inventario d'oggetti, un campionario di stili, dove tutto può essere continuamente rimescolato e riordinato in tutti i modi possibili.
Italo Calvino
Who is each of us if not a combination of experiences, information, readings, imaginings? Every life is an encyclopedia, a library, an inventory of objects, a collection of styles, where everything can be constantly shuffled and reordered in every way possible.
Italo Calvino
I have an updated processing of this shot www.flickr.com/photos/combinatorial/4226387051/
My first photo to make Explore... Dec 16, 2007, #365
NOTE: this is a semi-log graph, so a straight line is an exponential; each y-axis tick is 100x. This graph covers a 100,000,000,000,000,000,000x (10^20) improvement in computation/$.
I have color coded it to show the transition among the integrated circuit architectures. I also added the current NVIDIA workhorses — the A100 and H100. You can see how the mantle of Moore's Law has transitioned most recently from the GPU (green dots) to the ASIC (yellow and orange dots), and the H100 itself is a transitional species — from GPU to ASIC, with 8-bit performance optimized for AI models. Remember, there are thousands of invisible dots below the frontier of humanity's capacity to compute (e.g., everything from Intel in the past 13 years).
Tesla DOJO's dominance should not be a surprise, as Intel ceded leadership to NVIDIA a decade ago, and further handoffs were inevitable. The computational frontier has shifted across many technology substrates over the past 120 years, most recently from the CPU to the GPU to ASICs optimized for neural networks (the majority of new compute cycles).
Of all of the depictions of Moore’s Law, this is the one (originally by Ray Kurzweil) that I find to be the most useful, as it captures what customers actually value — computation per constant dollar.
Humanity’s capacity to compute has compounded for as long as we can measure it, exogenous to the economy, and starting long before Intel co-founder Gordon Moore noticed a refraction of the longer-term trend in the belly of the fledgling semiconductor industry in 1965.
Why the transition within the integrated circuit era? Intel lost to NVIDIA for neural networks because the fine-grained parallel compute architecture of a GPU maps better to the needs of deep learning. There is a poetic beauty to the computational similarity of a processor optimized for graphics processing and the computational needs of a sensory cortex, as commonly seen in neural networks today. A custom chip (like the Tesla D1 ASIC) optimized for neural networks extends that trend to its inevitable future in the digital domain. Further advances are possible in analog in-memory compute, an even closer biomimicry of the human cortex. The best business planning assumption is that Moore’s Law, as depicted here, will continue for the next 20 years as it has for the past 120.
For those unfamiliar with this chart, here is a more detailed description:
Moore's Law is both a prediction and an abstraction
Moore’s Law is commonly reported as a doubling of transistor density every 18 months. But this is not something the co-founder of Intel, Gordon Moore, ever said. It is a nice blending of his two predictions: in 1965, he predicted an annual doubling of transistor counts in the most cost-effective chip, and he revised it in 1975 to every 24 months. With a little hand waving, most reports attribute 18 months to Moore’s Law, but there is quite a bit of variability. The popular perception of Moore’s Law is that computer chips are compounding in their complexity at near-constant per-unit cost. This is one of the many abstractions of Moore’s Law, and it relates to the compounding of transistor density in two dimensions. Others relate to speed (the signals have less distance to travel) and computational power (speed x density).
Unless you work for a chip company and focus on fab-yield optimization, you do not care about transistor counts. Integrated circuit customers do not buy transistors. Consumers of technology purchase computational speed and data storage density. When recast in these terms, Moore’s Law is no longer a transistor-centric metric, and this abstraction allows for longer-term analysis.
What Moore observed in the belly of the early IC industry was a derivative metric, a refracted signal, from a longer-term trend, a trend that begs various philosophical questions and predicts mind-bending futures.
Ray Kurzweil’s abstraction of Moore’s Law shows computational power on a logarithmic scale, and finds a double exponential curve that holds over 120 years! A straight line would represent a geometrically compounding curve of progress.
Through five paradigm shifts – such as electro-mechanical calculators and vacuum tube computers – the computational power that $1000 buys has doubled every two years. For the past 35 years, it has been doubling every year.
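To make those rates concrete, a tiny arithmetic sketch in Python (illustrative only; the factors depend on the exact endpoints chosen):

    # One doubling every two years, sustained across 120 years:
    print(f"{2 ** (120 / 2):.2e}x")   # ~1.15e+18x
    # One doubling every year, sustained across the past 35 years:
    print(f"{2 ** 35:.2e}x")          # ~3.44e+10x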
Each dot is the frontier of computational price performance of the day. One machine was used in the 1890 Census; one cracked the Nazi Enigma cipher in World War II; one predicted Eisenhower’s win in the 1956 Presidential election. Many of them can be seen in the Computer History Museum.
Each dot represents a human drama. Prior to Moore’s first paper in 1965, none of them even knew they were on a predictive curve. Each dot represents an attempt to build the best computer with the tools of the day. Of course, we use these computers to make better design software and manufacturing control algorithms. And so the progress continues.
Notice that the pace of innovation is exogenous to the economy. The Great Depression and the World Wars and various recessions do not introduce a meaningful change in the long-term trajectory of Moore's Law. Certainly, the adoption rates, revenue, profits and economic fates of the computer companies behind the various dots on the graph may go through wild oscillations, but the long-term trend emerges nevertheless.
Any one technology, such as the CMOS transistor, follows an elongated S-shaped curve of slow progress during initial development, upward progress during a rapid adoption phase, and then slower growth from market saturation over time. But a more generalized capability, such as computation, storage, or bandwidth, tends to follow a pure exponential – bridging across a variety of technologies and their cascade of S-curves.
In the modern era of accelerating change in the tech industry, it is hard to find even five-year trends with any predictive value, let alone trends that span the centuries. I would go further and assert that this is the most important graph ever conceived.
Why is this the most important graph in human history?
A large and growing set of industries depends on continued exponential cost declines in computational power and storage density. Moore's Law drives electronics, communications and computers, and has become a primary driver in drug discovery, biotech and bioinformatics, medical imaging and diagnostics. As Moore's Law crosses critical thresholds, a formerly lab science of trial-and-error experimentation becomes a simulation science, and the pace of progress accelerates dramatically, creating opportunities for new entrants in new industries. Boeing used to rely on wind tunnels to test the performance of novel aircraft designs. Ever since CFD modeling became powerful enough, design has moved to the rapid pace of iterative simulation, and the nearby wind tunnels of NASA Ames lie fallow. The engineer can iterate at a rapid rate while simply sitting at a desk.
Every industry on our planet is going to become an information business. Consider agriculture. If you ask a farmer in 20 years’ time about how they compete, it will depend on how they use information, from satellite imagery driving robotic field optimization to the code in their seeds. It will have nothing to do with workmanship or labor. That will eventually percolate through every industry as IT innervates the economy.
Non-linear shifts in the marketplace are also essential for entrepreneurship and meaningful change. Technology’s exponential pace of progress has been the primary juggernaut of perpetual market disruption, spawning wave after wave of opportunities for new companies. Without disruption, entrepreneurs would not exist.
Moore’s Law is not just exogenous to the economy; it is why we have economic growth and an accelerating pace of progress. At Future Ventures, we see it in the growing diversity and global impact of the entrepreneurial ideas presented to us each year. The industries impacted by the current wave of tech entrepreneurs are more diverse, and an order of magnitude larger, than those of the '90s — from automobiles and aerospace to energy and chemicals.
At the cutting edge of computational capture is biology; we are actively reengineering the information systems of biology and creating synthetic microbes whose DNA is manufactured from bare computer code and an organic chemistry printer. But what to build? So far, we largely copy large tracts of code from nature. But the question spans across all the complex systems that we might wish to build, from cities to designer microbes, to computer intelligence.
Reengineering engineering
As these systems transcend human comprehension, we will shift from traditional engineering to evolutionary algorithms and iterative learning algorithms like deep learning and machine learning. As we design for evolvability, the locus of learning shifts from the artifacts themselves to the process that created them. There is no mathematical shortcut for the decomposition of a neural network or genetic program, no way to "reverse evolve" with the ease that we can reverse engineer the artifacts of purposeful design. The beauty of compounding iterative algorithms (evolution, fractals, organic growth, art) derives from their irreducibility. And it empowers us to design complex systems that exceed human understanding.
Why does progress perpetually accelerate?
All new technologies are combinations of technologies that already exist. Innovation does not occur in a vacuum; it is a combination of ideas from before. In any academic field, the advances of today are built on a large edifice of history. This is why major innovations tend to be 'ripe' and tend to be discovered at nearly the same time by multiple people. The compounding of ideas is the foundation of progress, something that was not so evident to the casual observer before the age of science. Science tuned the process parameters for innovation and became the best method for a culture to learn.
From this conceptual base comes the origin of economic growth and accelerating technological change: the combinatorial explosion of possible idea-pairings grows exponentially as new ideas come into the mix (on the order of 2^n possible groupings, per Reed's Law). It explains the innovative power of urbanization and networked globalization. And it explains why interdisciplinary ideas are so powerfully disruptive; it is like the differential immunity of epidemiology, whereby islands of cognitive isolation (e.g., academic disciplines) are vulnerable to disruptive memes hopping across, much as South America was to smallpox from Cortés and the Conquistadors. If disruption is what you seek, cognitive island-hopping is a good place to start, mining the interstices between academic disciplines.
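A small sketch in Python of that growth, contrasting simple pairings with the 2^n-order groupings of Reed's Law (here counted as all subsets of two or more ideas; the set-of-ideas framing is ours):

    from math import comb

    for n in (10, 20, 40, 80):
        pairs = comb(n, 2)        # pairwise idea combinations
        groups = 2 ** n - n - 1   # all groupings of 2+ ideas (Reed's Law)
        print(f"{n:3d} ideas: {pairs:5d} pairings, {groups:.2e} groupings")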
It is the combinatorial explosion of possible innovation-pairings that creates economic growth, and it's about to go into overdrive. In recent years, we have begun to see the global innovation effects of a new factor: the internet. People can exchange ideas like never before. Long ago, people were not communicating across continents; ideas were partitioned, and so the success of nations and regions pivoted on their own innovations. Richard Dawkins states that in biology it is genes that really matter, and we as people are just vessels for the conveyance of genes. It's the same with ideas, or "memes". We are the vessels that hold and communicate ideas, and now that pool of ideas percolates on a global basis more rapidly than ever before.
In the next 6 years, three billion minds will come online for the first time to join this global conversation (via inexpensive smartphones in the developing world). This rapid influx of three billion people to the global economy is unprecedented in human history, and so too will be the pace of idea-pairings and progress.
We live in interesting times, at the cusp of the frontiers of the unknown and of breathtaking advances. But it should always feel that way, engendering a perpetual sense of future shock.
Cyanotype, traditional iron salt party mix, combinatorially grappled in a head-shaped tub, brushed onto gelatin-sized vellum, subsequently exposed to Sol for an amount of time -- in the winter Texas air for ten minutes perhaps -- Finally, developed casually, while smoking, in water, vinegar, ammonia and tea-tannins.
Sa pratique est celle d’un artiste mais aussi d’un chercheur ou d’un hétérotopologue, tel que défini par Foucault dans son texte «les espaces autres». Cette recherche et construction de sens dans le ‘’liminal’’ ou ‘’l’entre deux’’ l’amène à produire des sculptures automatisées en effondrement, des images animées infinies qui tournent en boucle sur elles-mêmes, des chimères et accumulations linguistiques de mouvements artistiques inexistants.
Chimera est une installation composée d’un ensemble de barres qui génère aléatoirement toutes les minutes des associations de préfixes et de mouvements, tendances politiques, artistiques, économiques et religieuses. Chaque barre peut fonctionner individuellement ou en essaim programmé. La richesse combinatoire génère une infinité de propositions existantes, absurdes, inventives, sombres, anachroniques, utopiques. Chimera interroge la composition structurelle et temporelle du langage et la façon dont ce dernier conditionne la lecture et le partitionnement de l’histoire, notamment artistique. ‘‘Chimera’’ tend à être un outil d’ouverture à la création de nouvelles mouvances et pensées futures, mais c’est aussi inversement un outil de critique tendant vers une forme ‘‘d’épuisement ’’ des possibles par le recyclage de formes existantes ou passées de façon accélérée.
His practice is that of an artist but also of a researcher or a heterotopologist, as defined by Foucault in his text "Of Other Spaces". This search for and construction of meaning in the "liminal" or the "in-between" leads him to produce automated collapsing sculptures, infinite animated images that loop on themselves, and chimeras and linguistic accumulations of non-existent artistic movements.
Chimera is an installation made up of a set of bars which randomly generates, every minute, associations of prefixes with political, artistic, economic and religious movements and trends. Each bar can operate individually or in a programmed swarm. The combinatorial richness generates an infinity of propositions: existing, absurd, inventive, dark, anachronistic, utopian. Chimera questions the structural and temporal composition of language and the way in which it conditions the reading and partitioning of history, particularly artistic history. "Chimera" tends to be a tool of openness to the creation of new movements and future thought, but it is also, conversely, a tool of critique tending towards a form of "exhaustion" of the possible through the accelerated recycling of existing or past forms.
I took this photo of the latest hot lot of processor chips of various sizes at the spook shop summit (InQTel CEO Summit). Pretty shiny bling.
I am in the D-Wave board meeting now, and we just got a peek of next week's TIME Magazine cover (below). And it made the Charlie Rose show.
Here are some excerpts:
"The Quantum Quest for a Revolutionary Computer
The D-Wave Two is an unusual computer, and D-Wave is an unusual company. It's small, just 114 people, and its location puts it well outside the swim of Silicon Valley. But its investors include the storied Menlo Park, Calif., venture-capital firm Draper Fisher Jurvetson, which funded Skype and Tesla Motors. It's also backed by famously prescient Amazon founder Jeff Bezos and an outfit called In-Q-Tel, better known as the high-tech investment arm of the CIA. Likewise, D-Wave has very few customers, but they're blue-chip: they include the defense contractor Lockheed Martin; a computing lab that's hosted by NASA and largely funded by Google; and a U.S. intelligence agency that D-Wave executives decline to name.
The reason D-Wave has so few customers is that it makes a new type of computer called a quantum computer that's so radical and strange, people are still trying to figure out what it's for and how to use it. It could represent an enormous new source of computing power--it has the potential to solve problems that would take conventional computers centuries, with revolutionary consequences for fields ranging from cryptography to nanotechnology, pharmaceuticals to artificial intelligence.
That's the theory, anyway. Some critics, many of them bearing Ph.D.s and significant academic reputations, think D-Wave's machines aren't quantum computers at all. But D-Wave's customers buy them anyway, for around $10 million a pop, because if they're the real deal they could be the biggest leap forward since the invention of the microprocessor. …
Physicist David Deutsch once described quantum computing as "the first technology that allows useful tasks to be performed in collaboration between parallel universes." Not only is this excitingly weird, it's also incredibly useful. If a single quantum bit (or as they're inevitably called, qubits, pronounced cubits) can be in two states at the same time, it can perform two calculations at the same time. Two quantum bits could perform four simultaneous calculations; three quantum bits could perform eight; and so on. The power grows exponentially.
The supercooled niobium chip at the heart of the D-Wave Two has 512 qubits and therefore could in theory perform 2^512 operations simultaneously. That's more calculations than there are atoms in the universe, by many orders of magnitude. "This is not just a quantitative change," says Colin Williams, D-Wave's director of business development and strategic partnerships, who has a Ph.D. in artificial intelligence and once worked as Stephen Hawking's research assistant at Cambridge. "The kind of physical effects that our machine has access to are simply not available to supercomputers, no matter how big you make them. We're tapping into the fabric of reality in a fundamentally new way, to make a kind of computer that the world has never seen."
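A one-line sanity check in Python of the scale claimed here (the ~10^80 figure for atoms in the observable universe is the commonly cited estimate):

    from math import log10
    print(f"2^512 ~ 10^{512 * log10(2):.0f}")   # ~10^154, versus ~10^80 atoms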
Naturally, a lot of people want one. This is the age of Big Data, and we're burying ourselves in information-- search queries, genomes, credit-card purchases, phone records, retail transactions, social media, geological surveys, climate data, surveillance videos, movie recommendations--and D-Wave just happens to be selling a very shiny new shovel. "Who knows what hedge-fund managers would do with one of these and the black-swan event that that might entail?" says Steve Jurvetson, one of the managing directors of Draper Fisher Jurvetson. "For many of the computational traders, it's an arms race."
One of the documents leaked by Edward Snowden, published last month, revealed that the NSA has an $80 million quantum-computing project suggestively code-named Penetrating Hard Targets. Here's why: much of the encryption used online is based on the fact that it can take conventional computers years to find the factors of a number that is the product of two large primes. A quantum computer could do it so fast that it would render a lot of encryption obsolete overnight. You can see why the NSA would take an interest. …
For its first five years, the company existed as a think tank focused on research. Draper Fisher Jurvetson got onboard in 2003, viewing the business as a very sexy but very long shot. "I would put it in the same bucket as SpaceX and Tesla Motors," Jurvetson says, "where even the CEO Elon Musk will tell you that failure was the most likely outcome." By then Rose was ready to go from thinking about quantum computers to trying to build them--"we switched from a patent, IP, science aggregator to an engineering company," he says. Rose wasn't interested in expensive, fragile laboratory experiments; he wanted to build machines big enough to handle significant computing tasks and cheap and robust enough to be manufactured commercially. With that in mind, he and his colleagues made an important and still controversial decision.
Up until then, most quantum computers followed something called the gate-model approach, which is roughly analogous to the way conventional computers work, if you substitute qubits for transistors. But one of the things Rose had figured out in those early years was that building a gate-model quantum computer of any useful size just wasn't going to be feasible anytime soon. …
Adiabatic quantum computing may be technically simpler than the gate-model kind, but it comes with trade-offs. An adiabatic quantum computer can really solve only one class of problems, called discrete combinatorial optimization problems, which involve finding the best--the shortest, or the fastest, or the cheapest, or the most efficient--way of doing a given task.
This is great if you have a really hard discrete combinatorial optimization problem to solve. Not everybody does. But once you start looking for optimization problems, or at least problems that can be twisted around to look like optimization problems, you find them all over the place: in software design, tumor treatments, logistical planning, the stock market, airline schedules, the search for Earth-like planets in other solar systems, and in particular in machine learning.
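To make "discrete combinatorial optimization" concrete, here is a toy sketch in Python that minimizes a small QUBO energy function with classical simulated annealing, a classical cousin of the quantum annealing that D-Wave's hardware performs. The matrix Q and all parameters are invented for illustration:

    import math
    import random

    Q = [[-3,  2,  0],
         [ 2, -2,  1],
         [ 0,  1, -4]]   # hypothetical QUBO coefficients

    def energy(x):
        # QUBO objective: sum of Q[i][j] * x[i] * x[j] over binary variables.
        n = len(x)
        return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

    def anneal(n_bits=3, steps=5000, temp=2.0, cooling=0.999):
        x = [random.randint(0, 1) for _ in range(n_bits)]
        for _ in range(steps):
            candidate = x[:]
            candidate[random.randrange(n_bits)] ^= 1   # flip one bit
            delta = energy(candidate) - energy(x)
            # Always accept improvements; accept worse moves with a
            # temperature-dependent probability, then cool the system.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                x = candidate
            temp *= cooling
        return x, energy(x)

    print(anneal())   # e.g. ([1, 0, 1], -7), the lowest-energy assignment found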
Google and NASA, along with the Universities Space Research Association, jointly run something called the Quantum Artificial Intelligence Laboratory, or QuAIL, based at NASA Ames, which is the proud owner of a D-Wave Two. "If you're trying to do planning and scheduling for how you navigate the Curiosity rover on Mars or how you schedule the activities of astronauts on the station, these are clearly problems where a quantum computer--a computer that can optimally solve optimization problems--would be useful," says Rupak Biswas, deputy director of the Exploration Technology Directorate at NASA Ames. Google has been using its D-Wave to, among other things, write software that helps Google Glass tell the difference between when you're blinking and when you're winking.
Lockheed Martin turned out to have some optimization problems too. It produces a colossal amount of computer code, all of which has to be verified and validated for all possible scenarios, lest your F-35 spontaneously decide to reboot itself in midair. "It's very difficult to exhaustively test all of the possible conditions that can occur in the life of a system," says Ray Johnson, Lockheed Martin's chief technology officer. "Because of the ability to handle multiple conditions at one time through superposition, you're able to much more rapidly--orders of magnitude more rapidly--exhaustively test the conditions in that software." The company re-upped for a D-Wave Two last year.
Another challenge Rose and company face is that there is a small but nonzero number of academic physicists and computer scientists who think that they are partly or completely full of sh-t. Ever since D-Wave's first demo in 2007, snide humor, polite skepticism, impolite skepticism and outright debunkings have been lobbed at the company from any number of ivory towers. "There are many who in Round 1 of this started trash-talking D-Wave before they'd ever met the company," Jurvetson says. "Just the mere notion that someone is going to be building and shipping a quantum computer--they said, 'They are lying, and it's smoke and mirrors.'"
Seven years and many demos and papers later, the company isn't any less controversial. Any blog post or news story about D-Wave instantly grows a shaggy beard of vehement comments, both pro- and anti-. …
But where quantum computing is concerned, there always seems to be room for disagreement. Hartmut Neven, the director of engineering who runs Google's quantum-computing project, argues that the tests weren't a failure at all--that in one class of problem, the D-Wave Two outperformed the classical computers in a way that suggests quantum effects were in play. "There you see essentially what we were after," he says. "There you see an exponentially widening gap between simulated annealing and quantum annealing ... That's great news, but so far nobody has paid attention to it." Meanwhile, two other papers published in January make the case that a) D-Wave's chip does demonstrate entanglement and b) the test used the wrong kind of problem and was therefore meaningless anyway. For now pretty much everybody at least agrees that it's impressive that a chip as radically new as D-Wave's could even achieve parity with conventional hardware.
The attitude in D-Wave's C-suite toward all this back-and-forth is, unsurprisingly, dismissive. "The people that really understand what we're doing aren't skeptical," says Brownell. Rose is equally calm about it; all that wrestling must have left him with a thick skin. "Unfortunately," he says, "like all discourse on the Internet, it tends to be driven by a small number of people that are both vocal and not necessarily the most informed." He's content to let the products prove themselves, or not. "It's fine," he says. "It's good. Science progresses by rocking the ship. Things like this are a necessary component of forward progress."
Are D-Wave's machines quantum computers?
For now the answer is itself suspended, aptly enough, in a state of superposition, somewhere between yes and no. If the machines can do anything like what D-Wave is predicting, they won't leave many fields untouched. "I think we'll look back on the first time a quantum computer outperformed classical computing as a historic milestone," Brownell says. "It's a little grand, but we're kind of like Intel and Microsoft in 1977, at the dawn of a new computing era."
It's amazing that our biological bodies are seemingly built upon technology that has been passed to us through numerous replications of our DNA strands through each and every cell division, ad infinitum. Through meiosis, mitosis, and on it continues.
There is this moment of rebirth when our genetic codex is melanged through combinatoric permutation after permutated combination in this grand scheme of survival, life, and existence. It is the greatest hedging of bets, done purely for longer-term species survival, but is this concept of an individual that we adore so much just an elaborate illusion? Instead, are we really just infinitesimal parts of a whole, the huge composite structure of machinery that completes the circle of life, the way of the tao, the nature of the au naturel, the systemic mechanisms of divinity?
Are we just an experimental device, a mouthpiece for control over the dominion of evolutionary advantage such that our DNA is mixed into a combinatorial cocktail and then reborn anew time and time again for the pure fact of increasing the probability of species survival and thusly letting the natural forces at bay enhance and design our technology?
With all of this illusion and deceiving, it's hard to discern reality from dream, so it comes to light that we are a product of continuous being whereby there has been no end since the beginning. Is this so? And all this where the replication from one system to another has been so seamless that an illusion upon an illusion upon an illusion began to surface seemingly making us distinct individuals, when in fact we are but one grand individual?
There is no spoon, yet there is no divinity. I find this to be simultaneously true and false at the same time, which is blasphemy at its best. There isn't a spoon, it's all an illusion, but yet we do exist, we think, breathe, eat, and live, then where is the divinity? Is it all around us, including us, ourselves, our individualistic entities of existential being?
We are also not multiple beings, but one continuous breed of life that has been spawned, remixed, recoded, reconfigured time and time again just to maintain strength, vigor, and an edge on the competition in this jungle of an environment.
Ultimately, we are one. We are one being, one machine, one system, one divinity. We will not fully become aware of our supremeness nor immortality until we have reached our destination called destiny, but when we do reach it, we will be one, and with that oneness, we will be divine.
La machine parfaite est divinité. (The perfect machine is divinity.)
AWESOME when viewed in LIGHTBOX!!!!!
Professor Jeremy Sanders, FRS, Head of the 800 Committee, University of Cambridge.
He is also the Deputy Vice-Chancellor of the University.
AND, he is the Head of the School of Physical Sciences and a Fellow of Selwyn College.
AND, last month the Royal Society awarded him the Davy Medal for his pioneering contributions to several fields, most recently to the field of dynamic combinatorial chemistry at the forefront of supramolecular chemistry.
His research work: "We are interested in molecular recognition: Metal-ligand, pi-pi, donor-acceptor and hydrogen bonding interactions are used to create new supramolecular systems that may have useful recognition, catalytic or photophysical properties. Building blocks include peptides and metalloporphyrins, and products include macrocycles, nanotubes, rotaxanes and catenanes. For more detailed descriptions see our Group Web page".
Abiogenesis - the atheist and evolutionist belief that life can spontaneously generate itself from sterile matter whenever environmental conditions are conducive ... and the belief that this actually happened on the early Earth.
Is it possible?
IMPOSSIBLE ACCORDING TO INFORMATION THEORY.
Three fundamentals are essential for the material universe to exist: matter - energy - information.
Obviously, all theories about how the universe operates, and its origins, must take account of all three. However, every evolutionary, origin of life hypothesis yet devised (primordial soup, hydrothermal vent, etc. etc.) concentrates on the chemistry/physics of life, i.e. the interaction of matter and energy.
Atheists and evolutionists have virtually ignored the essential role and origin of information. We should demand to know why, especially as we are told (through the popular media and education system) that an evolutionary, origin of life scenario should be regarded as irrefutable, scientific fact.
Atheists and evolutionists are well aware that the information required for life cannot just arise of its own accord in a primordial soup. So why do they usually omit this crucial fact from their origin of life story?
In order to store information, a storage code is required. Just as the alphabet and language is the code used to store information in the written word, life requires both the information itself, which controls the construction and operation of all living things, and the means of storing that information. DNA is the storage code for living things.
No evolutionary, origin of life hypothesis has ever explained either how the DNA storage system was formed, or how the information encoded within that DNA storage system originated. In fact, even to attempt to look for the origin of information in physical matter is to ignore the natural laws about information.
Information theory completely rules out the spontaneous generation of life from non-life.
Information theory tells us: ANY MODEL FOR THE ORIGIN OF LIFE BASED SOLELY ON PHYSICAL AND/OR CHEMICAL PROCESSES, IS INHERENTLY FALSE. And: THERE IS NO KNOWN LAW OF NATURE, NO KNOWN PROCESS AND NO KNOWN SEQUENCE OF EVENTS, WHICH CAN CAUSE INFORMATION TO ORIGINATE BY ITSELF IN MATTER… So information theory not only rules out all evolutionary hypotheses which cannot explain the origin of information in original life, it also rules out all evolutionary hypotheses which cannot explain the origin of the completely new, increasingly complex information which would be required to be added to a gene pool for progressive evolution to take place in existing life.
Because of their zealous and unshakable faith in Darwinian evolution, most evolutionists choose to ignore this. They simply refuse to face this most important question of all: where does the complex information essential for all life come from? The reason seems obvious: there are only two answers which could be compatible with the evolution fable, and both are unscientific nonsense which violate information theory. They are: 1. That information can just arise magically out of nowhere. OR 2. That the material universe is an intelligent entity, which can actually create information.
(See more on genetic information and the DNA code later on)
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE LAW OF BIOGENESIS.
The Law of Biogenesis rules out the spontaneous generation of life from non-living matter under all known circumstances. All modern scientists now accept this well tested law as valid. It has never been falsified. In fact, the concept of medical sterilisation, hygiene & food preservation is wholly dependent on this law.
No sensible scientist would dare to claim that spontaneous generation of life ever happens in the world today, and there is no reason whatsoever to believe that this Law (like every natural law) is not always valid, in all places and at all times, within the material universe.
Yet, amazingly, because of their belief in biological evolution, evolutionists are quite prepared to flout this well-established Law and to resurrect the ancient belief in abiogenesis (life arising from non-life). Like latter-day advocates of the ancient Greek belief (that the goddess Gea could make life arise spontaneously from stones), evolutionists and atheists routinely present to the public (as a fact) the preposterous notion that original life on earth (and even elsewhere in the universe) just spontaneously generated itself from inert matter. Apparently, all that was required to bypass this well-established Law was a chance accumulation of chemicals in some alchemist's type brew of 'primordial soup' combined with raw energy from the sun, lightning or geothermal forces. (Such is their faith in the creative powers of matter). They call this science? Incredible!
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE SECOND LAW OF THERMODYNAMICS.
The Second Law of Thermodynamics rules out the spontaneous generation of life from non-life as a chance event. Even if we ignore the above reasons why spontaneous generation of life is impossible, the formation and arrangement by chance of all the components required for living cells is also impossible. The arrangement of all the components within the simplest of living cells is extremely precise; these components cannot just arrange themselves by chance.
According to the Second Law of Thermodynamics, when left to themselves, things naturally become more disordered, rather than more ordered. Or in other words, things will naturally go to more probable arrangements and disorder is overwhelmingly more probable than order. Disorder actually increases with the passage of time and also with the application of raw (undirected) energy (for example, heat).
Yet we are repeatedly told the evolution fable, that the numerous components required to form a first, self-replicating, living cell just assembled themselves in precise order, by pure chance, over a vast period of time, aided by the random application of raw, undirected energy.
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE LAW OF CAUSE AND EFFECT.
A fundamental principle of science is the law of cause and effect. It is a primary law of science, and the very basis of the scientific method.
The law of cause and effect tells us that an effect cannot be greater than its cause/s.
Life is not an intrinsic property of matter/energy - so it is beyond the capabilities of matter/energy to produce a property (life) it doesn't possess.
The interaction of matter and energy cannot produce an effect with properties extra and superior to its own properties, that would violate the law of cause and effect.
Can chemistry create biology - which has entirely different properties to its own?
Of course it can't.
Biology includes such properties as genetic information, the DNA code, consciousness and intelligence. To believe that chemistry can create biology - means believing that something inanimate can create additional, new properties that it doesn't possess. To exceed the limitations of its own properties would violate the law of cause and effect.
For matter/energy to be able to produce life whenever environmental conditions permit, it would have to be inherently predisposed to produce life.
It would have to embody an inherent plan/blueprint/instructions for life, as one of its properties. The inevitable question then has to be - where does an inherent predisposition for life come from? It can only signify the existence of purpose in the universe and that is something atheists could never accept.
A purpose, order or plan can only come from a planner or intelligent entity. So it is a catch 22 situation for atheists ... the atheist/ evolutionist belief in abiogenesis either violates the law of cause and effect, OR is an admission of purpose in the universe. It can only be one or the other. Atheists cannot possibly accept the existence of purpose in the universe, because that would be the end of atheism. So the atheist belief in abiogenesis violates the law of cause and effect.
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO MATHEMATICS.
Even if we ignore the Law of Biogenesis, Information Theory and the Second Law of Thermodynamics (which all completely rule out the spontaneous generation of a living cell from non-living matter), mathematical probability also rules out the spontaneous generation of life from non-living matter.
The laws of probability are summed up in the Law of Chance. According to this Law, when the odds against a chance event are 10 to the power of 15, the chance of that event happening is negligible on a terrestrial scale. At odds of 10 to the power of 50, there is virtually no chance, even on a cosmic scale. The most generous and favourable mathematical odds against a single living cell appearing in this way by chance are a staggering 10 to the power of 40,000. A more likely calculation would put the odds at an even more awesome 10 to the power of 119,850. Remember, odds of 10 to the power of 50 are sufficient to make an event virtually impossible (except, perhaps, by magic!!).
Verdict of science - abiogenesis is not possible.
Fred Hoyle, The Big Bang in Astronomy, New Scientist 19 Nov 1981. p.526. On the origin of life in primeval soup.
“I don’t know how long it is going to be before astronomers generally recognise that the combinatorial arrangement of not even one among the many thousands of biopolymers on which life depends could have been arrived at by natural processes here on the Earth. Astronomers will have a little difficulty at understanding this because they will be assured by biologists that it is not so. The biologists having been assured in their turn by others that it is not so. The “others” are a group of persons who believe, quite openly, in mathematical miracles. They advocate the belief that tucked away in nature, outside of normal physics, there is a law which performs miracles.”
“Since science does not have the faintest idea how life on earth originated, it would only be honest to confess this to other scientists, to grantors, and to the public at large. Prominent scientists speaking ex cathedra, should refrain from polarising the minds of students and young productive scientists with statements that are based solely on beliefs.” Bio-informaticist, Hubert P. Yockey. Journal of Theoretical Biology [Vol 91, 1981, p 13].
Conclusion: Abiogenesis is impossible - it is just another atheist myth debunked by science.
Evolutionists and atheists are quite entitled to abandon the scientific method and all common sense by choosing to believe that all the necessary information for life can just appear in matter, as if by magic. They can also choose to believe that the Laws of Biogenesis, Mathematical Probability, and Cause and Effect, and the Second Law of Thermodynamics, were all somehow magically suspended to enable their purported evolution of life from sterile matter to take place. They can believe whatever they like. But they have no right to present such unscientific flights of fancy through the media and our education system as though they are supported by science.
More about DNA and the origin of life.
The discovery of DNA should have sounded the death knell for evolution. It is only because atheists and evolutionists tend to manipulate and interpret evidence to suit their own preconceptions that makes them believe DNA is evidence FOR evolution.
It is clear that there is no natural mechanism which can produce constructional, biological information, such as that encoded in DNA.
Information Theory (and common sense) tells us that the unguided interaction of matter and energy cannot produce constructive information.
Do atheists/evolutionists even know where the very first, genetic information in the alleged Primordial Soup came from?
Of course they don't, but with the usual bravado, they bluff it out, and regardless, they rashly present the spontaneous generation of life as a scientific fact.
However, a fact, it certainly isn't .... and good science it certainly isn't.
Even though atheists/evolutionists have no idea whatsoever about how the first, genetic information originated, they still claim that the spontaneous generation of life (abiogenesis) is an established scientific fact, but this is completely disingenuous. Apart from the fact that abiogenesis violates the Law of Biogenesis, the Law of Cause and Effect and the Second Law of Thermodynamics, it also violates Information Theory.
Evolutionists/atheists have an enormous problem with explaining how the DNA code itself originated. However that is not even the major problem. The impression is given to the public by evolutionists that they only have to find an explanation for the origin of DNA by natural processes - and the problem of the origin of genetic information will have been solved.
That is a confusion in the minds of many people that evolutionists/atheists cynically exploit.
Explaining how DNA was formed by chemical processes explains only how the information storage medium was formed; it tells us nothing about the origin of the information it carries.
To clarify this, it helps to compare DNA to other information storage media.
For example, if we compare DNA to the written word, we understand that the alphabet is a tangible medium for storing, recording and expressing information, it is not information in itself. The information is recorded in the sequence of letters, forming meaningful words.
You could say that the alphabet is the 'hardware' created from paper and ink, and the sequential arrangement of the letters is the software. The software is a mental construct, not a physical one.
The same applies to DNA. DNA is not information of itself; just like the alphabet, it is the medium for storing and expressing information. It is an amazingly efficient storage medium. However, it is the sequence or arrangement of the bases which is the actual information, not the DNA code itself.
So, if evolutionists are ever able to explain how DNA was formed by chemical processes, it would explain only how the information storage medium was formed. It will tell us nothing about the origin of the information it carries.
Thus, when atheists and evolutionists tell us it is only a matter of time before 'science' will be able to fill the 'gaps' in our knowledge and explain the origin of genetic information, they are not being honest. Explaining the origin of the 'hardware' by natural processes is an entirely different matter to explaining the origin of the software.
Next time you hear evolutionists/atheists skating over the problem of the origin of genetic information with their usual bluff and bluster, and parroting their usual nonsense about science being able to fill such gaps in knowledge in the future, don't be fooled. They cannot explain the origin of genetic information, and never will be able to. The software cannot be created by chemical processes or the interaction of energy and matter; it is not possible. If you don't believe that, then by all means put it to the test by challenging any evolutionist to explain how genetic information (not DNA) can originate by natural means. I can guarantee they won't be able to do so.
Dr James Tour - 'The Origin of Life' - Abiogenesis decisively refuted.
FOUNDATIONS OF SCIENCE
The Law of Cause and Effect. Dominant Principle of Classical Physics. David L. Bergman and Glen C. Collins
www.thewarfareismental.net/b/wp-content/uploads/2011/02/b...
"The Big Bang's Failed Predictions and Failures to Predict: (Updated Aug 3, 2017.) As documented below, trust in the big bang's predictive ability has been misplaced when compared to the actual astronomical observations that were made, in large part, in hopes of affirming the theory."
kgov.com/big-bang-predictions
In: KAPPELMAYR, Barbara (Red.) (1995). Geïllustreerd handboek van de kunst. VG Bild-Kunst/De Hoeve, Alphen aan de Rijn. ISBN 90 6113 763 2
---
Pp. 874ff in: QUADRALECTIC ARCHITECTURE – A Panoramic Review by Marten Kuilman. Falcon Press (2011). ISBN 978-90-814420-0-8
quadralectics.wordpress.com/4-representation/4-2-function...
‘Real’ palaces were designed and constructed in Spain at about the same time as Palladio provided the Valmarana family with shelter in Italy. The Royal Palace of the Escorial is located some forty-five kilometers northwest of Madrid (Spain) at the rim of the Guadarrama Mountains. It appears as a great stone platform carved from the mountain, and its harmony with the landscape makes it a stonescape. It is reminiscent, according to George KUBLER (1982, p. 98), of certain Quattrocento paintings of ideal cities drawn with a single-point perspective in Renaissance Italy. He gives the panel painting ‘A City Square’, attributed to Luciano de Laurana, in the Walters Art Gallery in Baltimore, as an example.
The history of the Escorial has four distinct elements, which were planned by King Philip II (1527 – 1598) after he became King of Spain in 1556: 1. The initial purpose as a place to house the tombs of the dynasty, in particular his father Charles V, who was buried in Yuste; 2. The foundation of a monastery (with hospital buildings); 3. A basilica (with a dome); 4. A palace (with a library). These four intentions, which were brought forward more or less simultaneously, have aspects of higher division thinking, but the psychological setting of the King is hard to prove.
Spain was, in the second half of the sixteenth century, at the height of its political power, covering the larger part of Europe when Philip II was King of Spain and Portugal, King of Naples, Duke of Milan, Ruler of the Spanish Netherlands, and King consort of England (as the husband of Mary I). It was, furthermore, a global player in the colonial expansions across the Atlantic.
King Philip II began his search for the site of a new monastery in 1558 – 1559. He called it San Lorenzo de la Victoria – referring to the victory in the battle of San Quintin (in northern France) on 10 August 1557, the day of San Lorenzo. The King enlisted the help of the Jeronymite Order, but their suggestions and plan were about half the size of the cuadro (block), which was laid out in April 1562 at a location near El Escorial. The plan of the monastery, which was the first to be started, had a classical tetradic design.
George KUBLER (1982) mentioned three Jeronymite friars who played a major role in the history of the construction of the Escorial. Juan de San Jeronimo was present from 1562 to 1591 as the chief accountant and was most authoritative as a chronicler. Antonio de Villacastin was the Obrero mayor (chief workman), and Jose de Sigüenza wrote a history of the building by recording the progress of design and construction.
The official work started in 1563 with the intention of Philip II to bring the body of his father Charles V, the Emperor, who died in 1558, from Yuste to the new location. Philip had an interest in building matters, which only increased after his European tour at his father’s command (1548 – 1551). The King visited England for the marriage to Queen Mary (1516 – 1558, also known as Bloody Mary, because she had three hundred religious dissenters burned at the stake) in July 1554. He was accompanied on that (politically inspired) trip by the architect and engineer Gaspar de Vega, who had to study foreign buildings and constructions which could be useful in Spain. Vega returned overland and visited places like the Louvre, St. Germain-en-Laye and Fontainebleau.
The three main architects of the Escorial were Francisco de Villalpando, Juan Bautista de Toledo, and Juan de Herrera. The first-named was originally a bronze worker who translated Serlio. He was titled a ‘geometer and architect’, the first official use of this term by a Spanish royal patron. His qualities as a humanist and theorist gained him (royal) recognition in the liberal art of architecture (KUBLER, 1982).
The second, Juan Bautista de Toledo, was appointed as an architect in 1559. He had been Michelangelo’s assistant at St. Peter’s from 1546 to 1548. His promotion turned into a personal tragedy when the ship that was to bring his wife, two daughters, and all his books and papers from Naples to Spain sank, and all were lost. His appointment – after this event and as an outsider – was marred with conflicts and crises, but the King backed him until he died on 21 May 1567.
The third, Juan de Herrera, was an assistant of Toledo, appointed by the King in 1563 to check on the unpredictable authority of Toledo. He was appointed in 1576 as a royal architect – after years working in the background, with close ties to the King as Master of the Horse (1569 – 1577) and later (1579) as a court chamberlain.
The year of Toledo’s death (1567) was inactive, but it was followed two years later by an increase in activity. Flemish slaters expanded their trade after the work on the King's temporary dwelling La Fresneda was finished. The main staircase, which was the showpiece of the monastery, the roofing of the kitchen wing, and the paving made good progress. The cloister was finished in 1579 when the parapets were placed. The basilica was started in 1574 and finished in 1586.
The building of the fountain began in 1586, following the symbolism of the Garden of Eden, with four rivers watering Asia, Africa, Europe and America. The design had similarities with the Fons Vitae, also with four basins, at the Manga cloister of Santa Cruz in Coimbra (Portugal), built in 1533 – 1534.
The work on the actual royal dwelling (King’s House) in the northeast quadrant had begun in 1570 – 1572. It took nearly fifteen years until the court moved from their provisional quarters to the new accommodation in August 1585, but most of the palace and the college had still to be finished.
The library portico, which was part of Toledo’s ‘universal plan’, only started when the construction of the palace, basilica, and college had ceased, and was finished in 1583. The hospital buildings (infirmary) were situated outside the main cuadro (of 1562) at the southwestern corner. Farm buildings, later known as La Compana, were also outside the monastery. The northern service buildings (casas de oficios) were mentioned in 1581. Fig. 727 shows the Escorial in a reconstruction of the situation in 1568.
The history of the Escorial came into a new phase after Philip died in September 1598. The complex was complete except for its initial purpose: the underground burial chamber intended for the tombs of the dynasty. The circular plan of the Panteón, initiated under Herrera’s direction, had four stairs and a light shaft. However, little work was done until 1617 – 1635, when G.B. Crescenzi altered the plan from circular to octagonal. After he died in 1635, the work was completed in 1654 by Fray Nicolas de Madrid (following Crescenzi’s plan). The crypt was described by Fray Francisco de los Santos as the Panteon. His book included all the rituals of transferring the royal bodies since 1586.
Several fires damaged the complex in later years. The first occurred in 1577 at the southwest tower. The most destructive fire took place on 7 June 1671, when the monastery roofs also burst into flames and many manuscripts were destroyed. Some sixty years later, in 1731, fire broke out again at a chimney in the college. The Compana was destroyed in 1744, and the last great fires took place in 1763 and 1825.
A plague of termites threatened the building in 1953, an event that sparked a government-instigated restoration program. The crossing towers of the monastery and college were rebuilt in 1963. Their spires had been redesigned in a Baroque fashion by Bartolomé Zúmbigo in 1673, but were now returned to Herrera's original layout of the last quarter of the sixteenth century. The result is an example of the use of two of the major elements of a quadralectic architecture: the octagonal roof fitted onto the square of the tower.
Characterization of the Escorial complex by art historians (like Nikolaus Pevsner) pointed to a classification as a ‘mannerist’ building. Mannerism is the term (from maniera) used for imitation and exaggeration of the work of the High Renaissance. Its severity and simplicity were associated in the first half of the twentieth century (mainly by German art historians) with puritanism and asceticism, like the character of Philip II himself. This perception was later challenged and even denied: ‘If psychic states and architectural forms were this closely related in the process of design, then architecture as a whole would long ago have been recognized as a dictionary of psychic attitudes’ (KUBLER, 1982; p. 126).
The plan of the Escorial near Madrid follows tetradic lines with a four-division in function (palace, college, monastery, and place of contemplation) organized around a church with a square ground plan.
Some observers pointed to Post-Reformation geomancy as initiating the design. Nigel PENNICK (1979) stated that 'the Escorial at Madrid was built according to a Jesuit interpretation of the Vision of Ezekiel'. Others went further back and tried to find Renaissance ideas of magic underlying the design of the Escorial (TAYLOR, 1967). René Taylor wondered whether the courtier and 'architect' Herrera could not be 'a Magus, a man deeply versed in Hermetism and occult lore, who by virtue of this was attached in a special way to the King?'
George Kubler (pp. 128 – 130) denied the view that the King and Herrera held occult views. He was able to show that the King did not sympathize with astrology and horoscopes. The court's association with the mystic Ramon Lull (1232 – 1316) – the 'Doctor illuminatus', with his combinatorial method for categorizing all possible knowledge (see p. 780), but also with his intention to convert Muslims to Christianity – was purely academic, according to Kubler. It is regrettable that none of these authors makes any reference to a particular type of division thinking, which might elucidate labels like Mannerism, Puritanism, astrology, magic, etc.
---
Bibliography
KUBLER, George (1982). Building the Escorial. Princeton University Press, Princeton, New Jersey. ISBN 0-691-03975-5
PENNICK, Nigel (1979). The Ancient Science of Geomancy. Man in harmony with the earth. Thames and Hudson Ltd., London.
TAYLOR, René (1967). Architecture and Magic. Considerations on the Idea of the Escorial. Pp. 81 – 109 in: Essays in the History of Architecture Presented to Rudolf Wittkower. New York.
Abstract
This dissertation seeks to define the importance of John Dee's interpretation of mediaeval and Renaissance esoterica regarding the contacting of daemons and its evolution into a body of astrological and terrestrial correspondences and intelligences that included a Biblical primordial language, or a lingua adamica. The intention and transmission of John Dee's angel magic is linked to the philosophy outlined in his earlier works, most notably the Monas Hieroglyphica, and so this dissertation also provides a philosophical background to Dee's angel magic. The aim of this dissertation is to establish Dee's conversations with angels as a magic system that is a direct descendant of Solomonic and Ficinian magic with unique Kabbalistic elements. It is primarily by the Neoplatonic, Hermetic, Kabbalistic, and alchemical philosophy presented in the Monas Hieroglyphica that interest in Dee's angel magic was transmitted through the Rosicrucian movement. Through Johann Valentin Andreae's Chymische Hochzeit Christiani Rosencreutz anno 1459, the emphasis on a spiritual, inner alchemy became attached to Dee's philosophy. Figures such as Elias Ashmole, Ebenezer Sibley, Francis Barrett, and Frederick Hockley were crucial in the transmission of interest in Dee's practical angel magic and Hermetic philosophy to the founders of the Hermetic Order of the Golden Dawn. The rituals of the Golden Dawn utilized Dee's angel magic, in addition to creative Kabbalistic elements, to form a singular practice that has influenced Western esoterica of the modern age. This study utilizes a careful analysis of primary sources including the original manuscripts of the Sloane archives, the most recent scholarly editions of Dee's works, authoritative editions of original documents linked to Rosicrucianism, and Israel Regardie's texts on Golden Dawn practices.
Introduction
John Dee’s (1527-1609) conversations with angels have been the subject of scrutiny of various parties since their inception. Nobles were divided in their opinions of the supernatural. Dee and his notorious scryer, Edward Kelly, were praised, supported, threatened, or betrayed for their experiments in super-celestial magic; a kind of magic especially noted amongst detractors for its risk in contacting chthonic spirits. The traditional Christian perspective regarding the summoning of angels has been suspect since the Middle Ages due to the biblical assertion that, whatever the entity’s own claims, a ‘demon’ may appear in the guise of an ‘angel’, especially those bearing non-traditional names (II Corinthians 11. 13-14). What made Dee capable of accepting this risk while expecting positive results?
Prior to his conversations with angels, Dee's reputation was that of a learned man of the highest caliber. He had been offered the position of Court Mathematician by the kings and emperors of various countries after his lectures on Euclid at the University of Paris in 1550. His personal library was well known as the largest in all of England, and his comprehensive mastery of its contents, together with its ramshackle organization, made his presence necessary in order even to navigate it. The quality of the library and its learned archivist were such that it was frequented by the leading lights of the day, including Queen Elizabeth herself. Why would a man of such great erudition seemingly eschew reason and turn his back on his higher learning, attempting instead to receive the answers to his life's scholarly inquiries from a crystal ball? In Dee's final years and those following his death, the dangerous reputation of a magus dealing in super-celestial magic caught up with him. Despite Dee's low reputation after his death, Johann Valentin Andreae (1586-1654) published his Rosicrucian work, Chymische Hochzeit Christiani Rosencreutz anno 1459 (or the Chemical Wedding; 1616), which featured Dee's Monas Hieroglyphica on the invitation to an allegorical wedding that described the process of the inner alchemy of the human spirit (further discussed later in this dissertation). Elias Ashmole (1617-1692) also made it his mission to collect Dee's writings and corresponded with his son, Arthur Dee (1579-1651), with the intention of writing a biography of Arthur's father, which was never completed. Méric Casaubon (1599-1671) used Dee's journals to write the True & Faithful Relation (1659), which at the time seemed to seal Dee's fate (despite Casaubon's noting of, and respect for, his pious and fervent Christianity) as a deluded diabolist who had clearly overstepped the station of man in the spiritual hierarchy by attempting to directly contact and hold conversation with angels.
Frederick Hockley is thought to have been a member of the possibly spurious Society of Eight and possessed a great interest in Dee's use of crystals to contact angels. Hockley's and MacKenzie's works and reputations were highly regarded by William Wynn Westcott who, alongside Samuel Liddell MacGregor Mathers and William Robert Woodman, founded the Hermetic Order of the Golden Dawn in 1888. The Golden Dawn's Second Order introduced its members to Dee's Enochian tables and angel magic in the form of Book H and Enochian Chess.
This dissertation shall attempt to treat the following questions: How did Dee’s philosophy and angel magic prove resilient enough to survive Casaubon’s damning persecution and persist into the modern era? What was the importance of Enochian angel magic to the Western esoteric traditions?
The first chapter, in two sections, will examine the sources of influence on John Dee's angel magic. The first section will present the sources of Dee's Hermetic philosophy that served as his rationale for his capability to perform theological magic; namely Marsilio Ficino, Giovanni Pico della Mirandola, and the Corpus Hermeticum, and their reflections in Dee's works. The second section will investigate the sources of practical magic that Dee used as inspiration for his own practice (directly or indirectly); namely Peter de Abano, Johannes Trithemius, Heinrich Cornelius Agrippa von Nettesheim, and the various pseudepigraphic or authorless grimoires such as the Liber Juratus Honorii, Ars Paulina, Ars Almadel, Ars Notoria, and Arbatel de Magia Veterum. The second chapter, in two sections, will examine the transmission of John Dee's Hermetic philosophy after his death. The first section will present John Dee's Hermetic and Apocalyptic philosophies as transmitted through the Rosicrucian writings of the Fama Fraternitatis, Confessio Fraternitatis, and the Chemical Wedding. The second section will investigate the transmission and revival of Dee's practical magic through the fringe-Masonic societies, especially through Frederick Hockley. The third chapter will examine the transmission of Enochian angel magic within the Hermetic Order of the Golden Dawn and its direct descendant order, the Stella Matutina. The examination will include Book H, Enochian Chess, the connection of Enochian angel magic to spiritual alchemy, Robert Felkin's usage of Dee's angel magic within the Stella Matutina, and the reformation of the Stella Matutina into the Order of Smaragdum Thalasses, the last known Golden Dawn organization to have made use of Enochian angel magic.
Overall, this dissertation intends to illustrate the resilience and importance of John Dee’s philosophy and its transmission from his angelic conversations to the highly influential Hermetic Order of the Golden Dawn, and thus to the modern era.
Chapter 1: The Philosophy and Practice of John Dee’s Angel Magic
It might be so if madness were simply an evil; but there is also a madness which is a divine gift, and the source of the chiefest blessings granted to men. For prophecy is a madness, and the prophetess at Delphi and the priestesses at Dodona when out of their senses have conferred great benefits on Hellas, both in public and private life, but when in their senses few or none.
In his outline of the history of magic and exaltation to the divine, Szönyi highlights the furies of Plato's Phaedrus. In Phaedrus, Socrates praised the madness that comes as a gift from the Muses, which Szönyi equates to an occult knowledge only available to the 'hypersensitive elect'. As mentioned before, Méric Casaubon praised John Dee's Christian piety and goodness (though he also regarded Dee as deluded and a bit gullible) throughout the preface to his True & Faithful Relation. French neatly illustrated the fall of Dee's reputation in the centuries after his death and showed how Casaubon's perception of pious delusion was further degraded into 'execrable insanity' by Thomas Smith in his Vita Joannis Dee (1707). By the nineteenth century, the character of Dee had devolved from Casaubon's misled, pious scholar to an immoral conjuror of spirits and a necromancer fit for sensationalist fiction. Calder aptly noted that the nineteenth century likely viewed all sixteenth-century science as 'devil-ridden superstition' and quoted a treatment of Dee by an anonymous writer in Blackwood's Edinburgh Magazine (1842): 'The majority of them were in all probability half mad and those who were whole mad of course set the fashion and were followed as the shining lights of the day.' Regarding Dee in comparison to his assistant, Kelly, the article stated, 'Dee was more respectable, because he was only half a rogue; the other half was made up of craziness.' Dee seemed to be possessed by this Platonic, divine madness and eschewed the orthodox Aristotelian assertion that science was to be the deduction of causal demonstrations on the basis of self-evident principles that could only be intuited and not demonstrated within a given discipline. The undercurrents of Neoplatonism that accepted magical practice within Arabic Aristotelianism provided a framework through which Neoplatonic philosophy, and thus Hermetic philosophy, could be combined to form a perspective that allowed the practice of magic to be considered a viable applied science. John Dee's angelic conversations were not the casting off of his high learning, but the very application of it in a context of divine madness. The next section will examine the Hermetic background of Dee's angel magic.
Ficino and Pico: The Hermetic Roots of Dee
This dissertation cannot effectively present Dee's Hermetic philosophy without addressing Marsilio Ficino (1433-1499), the translator of the Corpus Hermeticum and the author of De religione Christiana, De Triplici Vita Libri Tres, Theologia Platonica, Epistolae, and a densely annotated Omnia Divini Platonis opera (1532), all of which sat on Dee's shelves. In a time when the age of a work lent it greater authority, Ficino, like all other scholars of the Renaissance, believed Hermes Trismegistus to have been a very real figure and a precursor to all Greek wisdom: 'Of the sources for his magic to which Ficino himself refers the most are the Asclepius and, of course, Plotinus. The Asclepius, like the Orphica, had great authority for Ficino because it was a work of Hermes Trismegistus, a priscus theologus even more ancient than Orpheus, indeed contemporary with Moses; Plotinus was merely a late interpreter of this antique Egyptian wisdom.' Ficino applied the Hermetic writings as the basis of Neoplatonic philosophy. He believed the Plotinian lemma 'De Favore Coelitus Hauriendo' to be an expansion on the ability of man to create gods in the making of statues as described by Hermes in Asclepius 24 and 37. The similarities to Christianity present in Platonic and Neoplatonic texts assisted in their assimilation into Ficino's theology and provided a fine vehicle for his Hermetic Christianity. While this section deals with the philosophy behind Dee's angel magic, Ficino's own theological magic is deeply rooted in his theological philosophy and must be examined. Ficino's Hermetic-Christian magic was transmitted through the Stoic and Aristotelian elements of the stellar influences on man, a philosophical framing of magic that Dee shared. Like the Greek sources it drew on, Ficino's Christian super-celestial magic was 'daemonic' (not to be confused with the Christian invective 'demonic'). As Ficino states: [...] every person has at birth one certain daemon, the guardian of his life, assigned by his own personal star which helps him to that very task to which the celestials summoned him when he was born. Therefore anyone having thoroughly scrutinized his own natural bent [...] by the aforesaid indicators will so discover his natural work as to discover at the same time his own star and daemon. Following the beginnings laid down by them, he will act successfully, he will live prosperously; if not, he will find fortune adverse and will sense that the heavens are his enemy.
Furthermore: 'Now remember that you receive daemons or, if you will, angels, more and more worthy by degrees in accordance with the dignity of the professions, and still worthier ones in public government; but even if you proceed to these more excellent [levels], you can receive from your Genius and natural bent an art and a course of life neither contrary to, nor very unlike, themselves.' Ficino's cosmos is composed of a hierarchy of 'good' and 'bad' daemons assigned to the planets and the houses of the zodiac, who are responsible for communicating the will of the Anima Mundi to the inferior spheres. Ficino believed that through astrological interaction with nature, 'celestial goods' can descend to the pious magus' 'rightly prepared spirit' to receive fuller gifts from beneficial daemons. Interestingly, Ficino outlines a talismanic imagery in order to connect with his astral daemons that is clearly influenced by the Picatrix. We shall use the planet Mercury as our example: 'For example, if anyone looks for a special benefit from Mercury, he ought to locate him in Virgo, or at least locate the Moon there in an aspect with Mercury, and then make an image out of tin or silver; he should put on it the whole sign of Virgo and its character and the character of Mercury. [...] The form of Mercury: a man sitting on a throne in a crested cap, with eagle's feet, holding a cock or fire with his left hand, winged, sometimes on a peacock, holding a reed with his right hand, in a multicolored garment.' The Picatrix states the following of the stones proper to each planet and the formation of figures:
'Of the metals, Mercury has quicksilver and part of tin and glass, and of stones it has emerald and all stones of this type has part of azumbedich. [...] The image of Mercury according to Hermes is the image of a man with a rooster on his head, sitting in a throne; his feet look like those of an eagle and in the palm of his left hand he has fire and under his feet are the signs stated before. This is its form.' Dee's magical practice likewise exhibited angels that corresponded to the planets through the metals associated with them and the respective days of the week. However, Dee owes much of the structure of his seals and talismans to Giovanni Pico, discussed later in this section.
Supplied with the basis of ancient, newly unearthed lore anterior to the Neoplatonists and Arabic astrological magic, Ficino's theology was drawn from this long-forgotten, secret wisdom worthy of the title prisca theologia (Ficino's idea of a primordial faith from which all faiths stem). The next section of this chapter will address in detail just how influential the quest for a singular, united faith was to Dee. In 1614, a mere six years after Dee's death, a long debate on the authenticity of the Corpus Hermeticum's antiquity came to an end: Isaac Casaubon (1559-1614), Méric Casaubon's father, correctly identified the Corpus Hermeticum as having been written in the second and third centuries C.E. Still, the Hermetic (and intrinsically Platonic and Neoplatonic) influences on the culture and science of the Renaissance and the Enlightenment, while controversial, are arguably visible. The importance of the blend of Neoplatonic and Aristotelian philosophy that amalgamated the Great Chain of Being as represented by Ficino (further supported by Johannes Trithemius and Heinrich Cornelius Agrippa, discussed later) cannot be overlooked. The Great Chain of Being as a concept predates Greek thought and was vitally important in the forging of cosmologies. As Lovejoy and Szönyi both pointed out, Proclus used Cicero to succinctly summarize the idea and metaphor of the Great Chain of Being connecting all things to God: 'Since, from the Supreme God Mind arises, and from Mind, Soul, and since this in turn creates all subsequent things and fills them all with life, and since this single radiance illumines all and is reflected in each, as a single face might be reflected in many mirrors placed in a series; and since all things follow in continuous succession, degenerating in sequence to the very bottom of the series, the attentive observer will discover a connection of parts, from the Supreme God down to the last dregs of things, mutually linked together without a break. And this is Homer's golden chain, which God, he says, bade hang down from heaven to earth.' The Hermetica alone supplies no means through which to interact with the entities above Man in this Great Chain, and so Ficino developed his methods from Arabic and mediaeval medicine, matter theory, physics, and metaphysics, all based upon his studies in Neoplatonism. Copenhaver gives special attention to Proclus in the formation of Ficino's magic, an idea further acknowledged and corroborated by Clulee and Szönyi. The most significant connection between Neoplatonism and the Hermetica is Proclus' statement: 'Thus all things are full of gods [...]. The authorities on the priestly art have thus discovered how to gain the favor of powers above, mixing some things together and setting others apart in due order.' Ficino thought this to be Hermes Trismegistus' understanding of the cosmos as relayed by Proclus, as exemplified in Asclepius in Hermes' discourse on the ensouled gods created by man in the forms of statues. Thus, man can form a way to interact with intermediary entities by creating the images of gods. Proclus suggested the practice of a ceremonial magic in mentioning that through consecrations and divine services practitioners could achieve 'association with the [daemons], from whom they returned forthwith to actual works of the gods'. Ficino derived the natural ingredients of his magic from Proclus' De Sacrificio,
which he included in his De Vita:
Under the Solar star, that is Sirius, they set the Sun first of all, and then Phoebean daemons, which sometimes have encountered people under the form of lions or cocks, as Proclus testifies, then similar men and Solar beasts, Phoebean plants then, similarly metals and gems and vapor and hot air. By a similar system they think a chain of beings descends by levels from any star of the firmament through any planet under its dominion. If, therefore, as I said, you combine at the right time all the Solar things through any level of that order, i.e., men of Solar nature or something belonging to such a man, likewise animals, plants, metals, gems, and whatever pertains to these, you will drink in unconditionally the power of the Sun and to some extent the natural power of the Solar daemons.
Ficino clearly felt the weight of what he perceived as a monumental discovery of a tradition of theology and philosophy that had remained unbroken from Hermes to Plato. The assertions of a world full of gods by Hermes, the Stoics, Plato, and the Neoplatonists clearly impressed themselves on Ficino, but, with the further connection of Arabic medicine and Hermes' fortunate student being none other than Asclepius (the Greek god of medicine and healing), it seems a matter of course that so pious and learned a theologian would craft a magical system when it was so neatly assembled before him. One question remained: how does one make this daemonic, astrological magic compliant with Christianity? Dee faced a similar question in his conversations with angels, though Ficino chose a much different solution.
Where Ficino drew on nature to connect with the planetary daemons, Dee drew on the planetary daemons to connect with nature. All of Dee's sigils, talismans, and orations came from the angels themselves in compliance, rather than reliance, with esoteric literature available to him. It seemed Dee believed he had found a path that reconciled celestial magic with Christianity more aptly than Ficino's daemonic astrology; a path less 'daemonic' and more 'angelic'.
Ficino relied on the ancient Christian authority of Lactantius (c. 240-320). Lactantius, a Christian apologist, utilized Hermes Trismegistus' Asclepius in reconciliation with Christianity as the 'original faith of mankind' in his work Divinae Institutiones (304-313). While this text does not directly support Hermeticism, it shows a precedent for Hermetic philosophy being used as a method of reconciling differing patterns of belief. Ficino found this argument a viable counterbalance to St. Augustine of Hippo's (354-430) objection to Asclepius in Book VIII of De civitate Dei (415-417). Ficino also found Lactantius' argument supportive of his idea of the prisca theologia. These arguments linking Christianity to Hermeticism are certainly felt in Dee's reworking of grimoire magic into a profoundly Christian, prayer-based practice at its inception.
Plato's key role in Ficino's cosmology also necessitated a Christian sanitization. Here again, we find Plato's four furies, the 'divine madnesses', but combined with the theology of the Christian Pseudo-Dionysius the Areopagite, wherein each madness (prophetic, religious-mystical, poetical, and love) brings the aspirant closer to unity with God.
In the Propaedeumata Aphoristica (1558), Dee seems to have agreed with Ficino that the stars indeed have powers mankind can benefit from, but through the use of mirrors rather than the agency of daemons. Clulee compares the Propaedeumata to Dee's Monas Hieroglyphica (1564), stating that where the Propaedeumata presents man's interaction with the cosmos as a mechanically physical fact, the Monas sought to illustrate the power of symbols over that which the symbols represent.
Thus, Dee more clearly illustrates his acceptance of Ficino's Neoplatonic-Hermetic theological philosophy within the Monas. In the Neoplatonic paradigm, Calder underlines Proclus (and ancient mathematicians such as Theon and Nicomachus) as a figure of important influence on Dee's philosophy in the Monas Hieroglyphica in terms of the notion of One, or Unity. Proclus posed a problem wherein the One, or God, can only be approached by analogy or negation, and supplies the analogy that '[t]he One is like the sun's light which illuminates the world and radiates far and wide while it remains undiminished at its source'. Dee seems supremely confident of his attempt to communicate the One in a single symbol rife with countless analogies:
Though I call it hieroglyphic, he who has examined its inner structure will grant that all the same there is [in it] an underlying clarity and strength almost mathematical, such as is rarely applied in [writings on] matters so rare. Or is it not rare, I ask, that the common astronomical symbols of the planets (instead of being dead, dumb, or, up to the present hour at least, quasi-barbaric signs) should have become characters imbued with immortal life and should now be able to express their especial meanings most eloquently in any tongue and to any nation?
The recent scholarly opinion regarding the Hermetic element of Dee's philosophy as illustrated in the Monas is unified and agreed upon by Walton, Clulee, Szönyi, and Harkness in the following:
Since the Creator made the whole cosmos, not with hands but by the Word, understand that he is present and always is, creating all things, being one alone, and by his will producing all beings.
Ficino's reconciliation of his philosophy, magic, and Christianity was highly formative to Dee's justifications for his questionably heretical angelic conversations. However, Dee also incorporated Kabbalistic elements that Ficino eschewed. Ficino's friend, Giovanni Pico della Mirandola, artfully reconciled Kabbalah with Platonic and Hermetic philosophy, as well as Christianity. The connection of the divinity of the cosmos and man's ability to connect with it through images is granted new depths when combined with the power of names presented in practical Kabbalah, as written by Johannes Reuchlin (1455-1522), and further linked with Hermeticism and Christianity through Pico. Pico's contribution to the Hermetic-Kabbalistic philosophy most certainly piqued Dee's interest, as exemplified in his Hermetic-Christian definition of the 'real Cabbala' in his Monas Hieroglyphica.
It is fascinating and highly relevant to this essay that Pico proclaimed Ramon Llull's works, or the Ars Raymundi, to be Kabbalistic. Ramon Llull (1232/3-1316) channeled the idea of the Great Chain of Being in his assertion of the capacity of man to ascend the scala naturae, or the ladder of nature, through intellectual contemplation. Llull used the combination of a series of nine letters (B, C, D, E, F, G, H, I, and K) representing 'absolute attributes', to which nine relations, nine questions, nine subjects, nine virtues, and nine vices were added. The resulting number of binary combinations was calculated to be 17,804,320,388,674,561, which Llull explored with the use of geometrical figures meant to enumerate the terms and generate combinatorial pairings of the aspects of reality. The acceptance of pseudo-Llullian alchemical and Kabbalistic works as authentic, in conjunction with his mystic, mathematical diagrams, only served to make the Ars Raymundi all the more appealing to Dee. Pico argued that Llull's technique of combining letters was not unlike the Kabbalists' combination of the letters of the Hebrew alphabet, and he relied on Llull's Ars Combinatoria for his own system.
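The scale of Llull's system is easy to get a feel for with a few lines of code. The sketch below is purely illustrative (it reconstructs neither Llull's actual rotating figures nor the derivation of the quoted total); it simply counts the binary pairings of his nine letters and the raw combinations across six nine-fold categories:

from itertools import combinations

# Llull's nine letters, each standing for an 'absolute attribute'
letters = "BCDEFGHIK"

# Binary pairings within a single nine-letter figure: C(9, 2) = 36
pairs = list(combinations(letters, 2))
print(len(pairs))   # 36

# Crossing six nine-fold categories (attributes, relations, questions,
# subjects, virtues, vices) multiplies the possibility space:
print(9 ** 6)       # 531441 raw six-category combinations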
Regarding Pico's own system: in his Nine Hundred Theses (1486), he succinctly states his thoughts on Kabbalah and Platonism: 'That which among the Cabalists is called [...] Metatron is without doubt that which is called Pallas by Orpheus, the paternal mind by Zoroaster, the son of God by Mercury, wisdom by Pythagoras, the intelligible sphere by Parmenides.'
He then addresses Kabbalah and Christianity:
11>7. No Hebrew Cabalist can deny that the name Jesus, if we interpret it following the method and principles of the Cabala, signifies precisely all this and nothing else, that is: God the Son of God and the Wisdom of the Father, united to human nature in the unity of assumption through the third Person of God, who is the most ardent fire of love.
Pico's clear devotion to Hermetic philosophy was illustrated in the dedication of ten theses to 'Mercury Trismegistus' that explicated man's connection to a living nature, and thus to a God who is present in that life. Pico clearly believed in not merely the syncretism of faiths, but the reconciliation of seemingly disparate religious, philosophical, and cultural paradigms.
Johannes Reuchlin boldly deepened the connections between Kabbalah and Christianity in a time when Judaism was defined as a form of Satanism, perhaps even an unwitting one. Pico's Theses inspired Reuchlin to write De Verbo Mirifico (1494) in defense of Pico, and the central work on Christian Kabbalah, De Arte Cabalistica (1517). In De Verbo Mirifico, Reuchlin presented what he believed to be the reality and name of the Christian God made known through the Son in the pentagrammaton, the five-lettered name he believed to signify Jesus Christ. De Verbo Mirifico was listed in Dee's catalogue, and it is quite likely Dee was familiar with its material, judging by the tone of his magical practices and some of the aphorisms in the Propaedeumata Aphoristica. Through Pico and Reuchlin, the idea that the presence of God existed in images was expanded to include names of power. This presentation of the Kabbalah in a Christian, magical context was a crucial element of Dee's practice.
The encoding of the Sigillum Dei Aemeth, the Kings and Princes of the Heptarchia Mystica, and the divine names of the nations of the world and the angels overseeing them in the Liber Scientiae Auxilii all go to great lengths to identify the names of the angels. Dee presumably considered the use of these names crucial to contacting the angels in order to achieve divine understanding related to their offices, though there are no existing records of Dee ever using the names and orations described in the aforementioned books in such a way.
The significant link between Pico and Dee was the transmission of the combined Hermetic, Kabbalistic, and Platonic ideas through Agrippa's De Occulta Philosophia Libri Tres (1533), especially in regards to the threefold world (elementary, celestial, and intellectual/supercelestial) that Dee presents in his Mathematicall Praeface to the Elements of Geometrie of Euclid of Megara (1570). Dee utilized this threefold world as the basis of his supercelestial magic dealing with 'intelligences' or angels. His treatment of the threefold world in the Mathematical Preface follows:
All thinges which are, & haue beyng, are found vnder a triple diuersitie generall. For, either, they are demed Supernaturall, Naturall, or, of a third being [...] which, by a peculier name also, are called Thynges Mathematicall.
The linkage between the emanations of God in Neoplatonism and Kabbalistic works has been conjectured, but regardless of such a connection, the theological philosophies seem to have been separated more by the cultures that espoused them than by the actual contents of their literature. The inclusion of Kabbalah into the Neoplatonic and Hermetic philosophy under the auspices of a deeper Christianity influenced Dee's thought, and eventually his magical practice. This will be evidenced and examined in greater depth in the following section treating his angelic conversations.
www.academia.edu/921740/Enochian_Angel_Magic_From_John_De...
I just did a fireside chat with Stanford President John Hennessy and Eric Schmidt (who is now on his way to LA to announce the Android music deal).
Brilliant minds with a similar longing for a respect for data and truth.
Here are some of the questions I prepared that we did not get to as even bigger issues loomed in their minds:
On the topic of meaningful innovation — where does it come from, how can we foster it, what can we learn over time about the process of innovation vs. the product of innovation (e.g., tuning the parameters of communication and team size vs. target setting and visionary leadership).
The topics could naturally turn to globalization and competitiveness - the fractal fates of people, companies and nations. Do they embrace the primary vectors of change and growth, or retreat to atavistic comforts? For how long can someone opt out of progress and still catch up? In an era of exponential change, the sea change of history has become the drumbeat of decades... with an ever-quickening cadence.
I am personally very interested in the dynamics of accelerating technological change and its societal implications for the education imperative (and the adult reeducation imperative, as careers no longer last a lifetime) and the rich-poor gap in modern economies like the U.S. (network effects -> power law in income distribution).
I am also interested in disruptive entrepreneurship, the change agents of society. To the extent that all good ideas are combinations of prior ideas (Stuart Kauffman, Matt Ridley, Kevin Kelly), the combinatorial explosion of possibility space may explain accelerating change, and the disruptive power of interdisciplinary idea-pairings could be compared to the differential immunity of epidemiology (islands of cognitive isolation, a.k.a. academic disciplines, are vulnerable to disruptive memes much like South America was to smallpox from Cortés and the Conquistadors). If disruption is what you seek, cognitive island-hopping is a good place to start, mining the interstices between academic disciplines.
Considered as the fountainhead of innovation, the combinatorial explosion of possibly interacting ideas not only creates the economy and explains accelerating change; it also subsumes biological evolution (raising the primary vector of progress to a higher level of abstraction) and nurtures a rational optimism for the future.
And some quotes from my talk this morning:
“All technologies are combinations of technologies that already exist.” — Brian Arthur
• Combinatorial Explosion (explains accelerating change in technology)
• Creates Economy
“Science quickly became the greatest tool for making new things the world has ever seen. Science was in fact a superior method for a culture to learn.” — Kevin Kelly
“The average standard of living in London went up 50% from the time of Pericles to 1820. It went up another 50% in one lifespan from 1820 to 1865, and we saw the power of the Industrial Revolution. And now, the standard of living goes up 50% every five years in China.” — Larry Summers
“Throughout history, the engine of human progress has been the meeting and mating of ideas to make new ideas. It is our habit of trade, idea-sharing and specialization that has created the collective enterprise which set human living standards on a rising trend. The human race will prosper mightily in the years ahead, because ideas are having sex with each other as never before.” — Matt Ridley
• Urbanization (cities are more innovative per capita)
• Interdisciplinary Disruption (differential immunity is a benefit for disruptors)
• Globalization (global idea sex facilitated by the Internet, unveiling pockets of isolation)
“Computing is undergoing the most remarkable transformation since the invention of the PC. The innovation of the next decade is going to outstrip the innovation of the past three combined.”
– Intel CEO Paul Otellini, Sept. ‘11
I don't love make-up. I prefer a woman without make-up. I will like make-up if I cannot perceive it.
In the same vein, I rarely allow myself to go beyond certain limits when processing a photo; I could define that limit as "one should not perceive the trick". This is an exception to that rule, of course*. I wanted to explore this beautiful Medici lion further - how artificial lighting played with its surfaces - so I decided to go well beyond my usual limits and post this experiment in Rodilius.
* One could wonder "What about your texturization work?". It is not a bad question, and a tantalizing starting point for a reflection on the matter.
Texturization is quite another thing, to me. The heart of the matter is: Rodilius is an effect, i.e. a set of mathematical functions with a user interface allowing the user to change several parameters of the functions themselves. The result of the processing depends partly upon the processed image, and mostly upon how the parameters are set. The result of processing of this kind is essentially deterministic - in principle, one could produce a (vast) combinatorial catalogue of the results of Rodilius processing upon an image by varying the parameters one by one. And this reasoning holds for all the effects one can find in raster graphics editors like Photoshop, The Gimp, Corel PhotoPaint, etc.
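In code, that determinism is plain: an effect is a pure function of the image and its parameter settings, so sweeping the parameter grid enumerates the whole catalogue. A minimal sketch in Python (the function and parameter names are mine, not Rodilius's actual controls):

from itertools import product

def effect(image, radius, strength, smoothing):
    # Stand-in for a deterministic filter: same inputs, same output.
    return (image, radius, strength, smoothing)

# Every combination of settings yields one entry of the catalogue.
catalogue = [effect("lion.jpg", r, s, k)
             for r, s, k in product((1, 2, 3), (0.5, 1.0), (0, 1))]
print(len(catalogue))   # 3 * 2 * 2 = 12 variants of one image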
Texturization is a kind of processing more akin to an art of its own. How can the mood or general atmosphere of a photo change owing to texturization? Who can say! It is entirely up to the creativity of the artist. The result of texturization depends mostly upon the chosen texture or textures, the modes of blending, the choice of masking parts of a texture to cancel its effects on parts of the image... This is clearly a creative process in the most traditional sense.
There are even those who texturize other people's photos and think of the texturized image as a work of their own - which is not so strange, if one considers the ancient and perfectly legitimate practice of parody in music. A composer could take a theme from one of her previous works, or from another composer, and rework it in a completely different way, typically by a different kind of contrapuntal elaboration. Bach and Palestrina, to give a couple of very big names, practised parody regularly; but here I would give an example from my own experience as a singer. In Vivaldi's Gloria in D major RV 589, the last section, Cum Sancto Spiritu, which I have had the luck to sing, is a parody of the corresponding section of Giovanni Maria Ruggieri's 1708 Gloria. Vivaldi's elaboration of Ruggieri's theme results in a beautiful, powerful fugue a 4 which can make you weep with joy when you are immersed in its texture as a singer. There is simply no way to claim that Vivaldi "copied" or "stole" or "plagiarized" Ruggieri, since Vivaldi's elaboration of the theme is an entirely original creation.
I think that the French secular song L'homme armé holds the record for the most-parodied melody: over 40 compositions, mostly Masses, have been derived from its tune in the Renaissance period; and every one of these compositions is an individual work with a life of its own.
So I regard texturization as an art akin to parody in music; and I think that one should not define it simply as a way of "processing" an image, but rather as a "re-creation" of the image itself.
You can find details about the sculpture and the HDR processing of the original image in Beware of the kitten
The Silicon Graphics head in my office was my muse. I just finished reading a fascinating summary by Lin & Tegmark of the tie between the power of neural networks / deep learning and the peculiar physics of our universe. The mystery of why they work so well may be resolved by seeing the resonant homology across the information-accumulating substrate of our universe, from the base simplicity of our physics to the constrained nature of the evolved and grown artifacts all around us. The data in our natural world is the product of a hierarchy of iterative algorithms, and the computational simplification embedded within a deep learning network is also a hierarchy of iteration. Since neural networks are symbolic abstractions of how the human cortex works, perhaps it should not be a surprise that the brain has evolved structures that are computationally tuned to tease apart the complexity of our world.
Does anyone know about other explorations into these topics?
Here is a collection of interesting plain text points I extracted from the math in Lin & Tegmark’s article:
"The exceptional simplicity of physics-based functions hinges on properties such as symmetry, locality, compositionality and polynomial log-probability, and we explore how these properties translate into exceptionally simple neural networks approximating both natural phenomena such as images and abstract representations thereof such as drawings. We further argue that when the statistical process generating the data is of a certain hierarchical form prevalent in physics and machine-learning, a deep neural network can be more efficient than a shallow one. Various “no-flattening theorems” show when these efficient deep networks cannot be accurately approximated by shallow ones without efficiency loss."
This last point reminds me of something I wrote in 2006: "Stephen Wolfram’s theory of computational equivalence suggests that simple, formulaic shortcuts for understanding evolution (and neural networks) may never be discovered. We can only run the iterative algorithm forward to see the results, and the various computational steps cannot be skipped. Thus, if we evolve a complex system, it is a black box defined by its interfaces. We cannot easily apply our design intuition to the improvement of its inner workings. We can’t even partition its subsystems without a serious effort at reverse-engineering." — 2006 MIT Tech Review
Back to quotes from the paper:
Neural networks perform a combinatorial swindle, replacing exponentiation by multiplication: if there are say n = 10^6 inputs taking v = 256 values each, this swindle cuts the number of parameters from v^n to v×n times some constant factor. We will show that the success of this swindle depends fundamentally on physics: although neural networks only work well for an exponentially tiny fraction of all possible inputs, the laws of physics are such that the data sets we care about for machine learning (natural images, sounds, drawings, text, etc.) are also drawn from an exponentially tiny fraction of all imaginable data sets. Moreover, we will see that these two tiny subsets are remarkably similar, enabling deep learning to work well in practice.
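A quick back-of-the-envelope check of those numbers (my arithmetic, not the paper's):

import math

v, n = 256, 10**6                  # values per input, number of inputs

# A generic lookup table over all inputs needs v**n parameters,
# a number with roughly n*log10(v) digits:
print(round(n * math.log10(v)))    # ~2,408,240 digits

# The neural-network 'swindle' needs on the order of v*n parameters:
print(v * n)                       # 256,000,000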
Increasing the depth of a neural network can provide polynomial or exponential efficiency gains even though it adds nothing in terms of expressivity.
Both physics and machine learning tend to favor Hamiltonians that are polynomials — indeed, often ones that are sparse, symmetric and low-order.
1. Low polynomial order
For reasons that are still not fully understood, our universe can be accurately described by polynomial Hamiltonians of low order d. At a fundamental level, the Hamiltonian of the standard model of particle physics has d = 4. There are many approximations of this quartic Hamiltonian that are accurate in specific regimes, for example the Maxwell equations governing electromagnetism, the Navier-Stokes equations governing fluid dynamics, the Alfvén equations governing magnetohydrodynamics and various Ising models governing magnetization — all of these approximations have Hamiltonians that are polynomials in the field variables, of degree d ranging from 2 to 4.
2. Locality
One of the deepest principles of physics is locality: that things directly affect only what is in their immediate vicinity. When physical systems are simulated on a computer by discretizing space onto a rectangular lattice, locality manifests itself by allowing only nearest-neighbor interaction.
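Points 1 and 2 show up together in a textbook example: the Ising Hamiltonian is a degree-2 polynomial with only nearest-neighbor couplings. A minimal sketch on a periodic 2D lattice (my illustration, not code from the paper):

import numpy as np

def ising_energy(spins, J=1.0, h=0.0):
    # H = -J * sum over nearest-neighbor pairs s_i*s_j - h * sum_i s_i:
    # low polynomial order (d = 2) and strict locality in one formula.
    right = np.roll(spins, -1, axis=1)   # right neighbor (periodic)
    down = np.roll(spins, -1, axis=0)    # lower neighbor (periodic)
    return -J * np.sum(spins * right + spins * down) - h * np.sum(spins)

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(8, 8))
print(ising_energy(spins))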
3. Symmetry
Whenever the Hamiltonian obeys some symmetry (is invariant under some transformation), the number of independent parameters required to describe it is further reduced. For instance, many probability distributions in both physics and machine learning are invariant under translation and rotation.
Why Deep?
What properties of real-world probability distributions cause efficiency to further improve when networks are made deeper? This question has been extensively studied from a mathematical point of view, but mathematics alone cannot fully answer it, because part of the answer involves physics. We will argue that the answer involves the hierarchical/compositional structure of generative processes together with the inability to efficiently “flatten” neural networks reflecting this structure.
A. Hierarchical processes
One of the most striking features of the physical world is its hierarchical structure. Spatially, it is an object hierarchy: elementary particles form atoms which in turn form molecules, cells, organisms, planets, solar systems, galaxies, etc. Causally, complex structures are frequently created through a distinct sequence of simpler steps.
We can write the combined effect of the entire generative process as a matrix product.
If a given data set is generated by a (classical) statistical physics process, it must be described by an equation in the form of [a matrix product], since dynamics in classical physics is fundamentally Markovian: classical equations of motion are always first order differential equations in the Hamiltonian formalism. This technically covers essentially all data of interest in the machine learning community, although the fundamental Markovian nature of the generative process of the data may be an inefficient description.
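A tiny numerical illustration of that claim, assuming nothing beyond the quoted structure: model each causal step as a stochastic matrix, and the combined generative process is just their product applied to a distribution over top-level causes.

import numpy as np

rng = np.random.default_rng(0)

def random_markov(n_out, n_in):
    # Column-stochastic matrix: each column is a conditional distribution.
    M = rng.random((n_out, n_in))
    return M / M.sum(axis=0)

# Three hierarchical steps, e.g. cause -> features -> pixels.
steps = [random_markov(8, 4), random_markov(16, 8), random_markov(32, 16)]

p = np.array([1.0, 0.0, 0.0, 0.0])   # point mass on one top-level cause
for M in steps:                      # combined effect: M3 @ M2 @ M1 @ p
    p = M @ p
print(p.shape, p.sum())              # (32,) 1.0 -- still a distribution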
Summary
The success of shallow neural networks hinges on symmetry, locality, and polynomial log-probability in data from or inspired by the natural world, which favors sparse low-order polynomial Hamiltonians that can be efficiently approximated. Whereas previous universality theorems guarantee that there exists a neural network that approximates any smooth function to within an error ε, they cannot guarantee that the size of the neural network does not grow to infinity with shrinking ε or that the activation function σ does not become pathological. We show constructively that given a multivariate polynomial and any generic non-linearity, a neural network with a fixed size and a generic smooth activation function can indeed approximate the polynomial highly efficiently.
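To make the constructive claim concrete, here is a toy version of the standard four-neuron multiplication gadget, with softplus as my choice of generic smooth nonlinearity (the paper's argument does not depend on this choice). A Taylor expansion shows the combination below converges to x*y as the input scale shrinks, at a fixed network size of four units:

import numpy as np

def softplus(u):
    return np.log1p(np.exp(u))        # smooth, with sigma''(0) = 1/4

def nn_multiply(x, y, lam=0.01):
    # Four softplus 'neurons' approximating the product x*y; the error
    # is O(lam**2), so it vanishes as lam -> 0 at fixed network size.
    s = softplus
    num = s(lam*(x+y)) + s(-lam*(x+y)) - s(lam*(x-y)) - s(-lam*(x-y))
    return num / (4 * 0.25 * lam**2)  # 4 * sigma''(0) * lam**2

print(nn_multiply(3.0, -2.0))         # ~ -6.0

Since products can be chained into arbitrary monomials, a fixed budget of such units suffices for any given polynomial, which is the spirit of the paper's construction.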
The success of deep learning depends on the ubiquity of hierarchical and compositional generative processes in physics and other machine-learning applications.
And thanks to Tech Review for the pointer to this article:
I recently uncovered a trippy little piece I wrote on constructive constructions for the creatives at ARUP:
Evolving Cities and Culture
Innovation is critical to economic growth, progress, and the fate of the planet. Yet, it seems so random. But patterns emerge in the aggregate, and planners and politicians may be able to promote innovation and growth despite the overall inscrutability of this complex system.
One emergent pattern, spanning centuries, is that the pace of innovation is perpetually accelerating, and it is exogenous to the economy. Rather, it is the combinatorial explosion of possible innovation-pairings that creates economic growth. And that is why cities are the crucible of innovation.
Geoffrey West of the Santa Fe Institute argues that cities are an autocatalytic attractor and amplifier of innovation. People are more innovative and productive, on average, when they live in a city because ideas can cross-pollinate more easily. Proximity promotes propinquity and the promiscuity of what Matt Ridley calls “ideas having sex”. This positive network effect drives another positive feedback loop - by attracting the best and the brightest to flock to the salon of mind, the memeplex of modernity.
Cities are a structural manifestation of the long arc of evolutionary indirection, whereby the vector of improvement has risen steadily up the ladder of abstractions, from chemicals to genes to systems to networks. At each step, the pace of progress has leapt forward, making the prior vectors seem glacial in comparison; we now see the nature of DNA, and even of a neuron, as a static variable in modern times. Now it's all about the ideas - the culture and the networks of humanity. We have moved from genetic to memetic evolution, and much like the long-spanning neuron (which took us beyond nearest-neighbor and broadcast signaling among cells) ushered in the Cambrian explosion of differentiated and enormous body plans, the Internet brings long-spanning links between humans, engendering an explosion in idea space, straddling isolated pools of thought.
And it’s just beginning. In the next 10 years, four billion minds will come online for the first time to join this global conversation (via Starlink broadband satellites).
But why does this drive innovation and accelerating change? Start with Brian Arthur’s observation that all new technologies are combinations of technologies that already exist. Innovation does not occur in a vacuum; it is a combination of ideas from before. In any academic field, the advances today are built on a large edifice of history. This is the foundation of progress, something that was not so evident to the casual observer before the age of science. Science tuned the process parameters for innovation, and became the best method for a culture to learn.
From this conceptual base come the origins of economic growth and accelerating technological change, as the combinatorial explosion of possible idea pairings grows exponentially as new ideas enter the mix (on the order of 2^n possible groupings, per Reed's Law). It explains the innovative power of urbanization and networked globalization. And it explains why interdisciplinary ideas are so powerfully disruptive; it is like the differential immunity of epidemiology, whereby islands of cognitive isolation (e.g., academic disciplines) are vulnerable to disruptive memes hopping across, much like South America was to smallpox from Cortés and the Conquistadors. If disruption is what you seek, cognitive island-hopping is a good place to start, mining the interstices between academic disciplines.
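For a feel of the Reed's Law arithmetic (a plain count, excluding singletons and the empty set):

# Possible subgroups among n participants grow as 2**n:
for n in (10, 20, 40, 80):
    print(n, 2**n - n - 1)

Even at n = 80 the count (about 1.2e24) already dwarfs any plausible number of participants, which is why group-forming networks are said to dominate pairwise ones.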
So what evidence do we have of accelerating technological change? At Future Ventures, we see it in the diversity and quality of the entrepreneurial ideas arriving each year across our global offices. Scientists do not slow their thinking during recessions.
For a good mental model of the pace of innovation, consider Moore’s Law in the abstract – the annual doubling of compute power or data storage. As Ray Kurzweil has plotted, the smooth pace of exponential progress spans from 1890 to today, across countless innovations, technology substrates, and human dramas — with most contributors completely unaware that they were fitting to a curve.
Moore’s Law is a primary driver of disruptive innovation – such as the iPod usurping the Sony Walkman franchise – and it drives not only IT and communications, but also now genomics, medical imaging and the life sciences in general. As Moore’s Law crosses critical thresholds, a formerly lab science of trial and error experimentation becomes a simulation science and the pace of progress accelerates dramatically, creating opportunities for new entrants in new industries. And so the industries impacted by the latest wave of tech entrepreneurs are more diverse, and an order of magnitude larger — from automobiles and rockets to energy and chemicals.
At the cutting edge of computational capture is biology; we are actively reengineering the information systems of biology and creating synthetic microbes whose DNA was manufactured from bare computer code and an organic chemistry printer. But what to build? So far, we largely copy large tracts of code from nature. But the question spans across all the complex systems that we might wish to build, from cities to designer microbes, to computer intelligence.
As these systems transcend human comprehension, will we continue to design them or will we increasingly evolve them? As we design for evolvability, the locus of learning shifts from the artifacts themselves to the process that created them. There is no mathematical shortcut for the decomposition of a neural network or genetic program, no way to "reverse evolve" with the ease that we can reverse engineer the artifacts of purposeful design. The beauty of compounding iterative algorithms (machine learning, evolution, fractals, organic growth, art) derives from their irreducibility.
And what about human social systems? The corporation is a complex system that seeks to perpetually innovate. Leadership in these complex organizations shifts from direction setting to a wisdom of crowds. And this “process learning” is a bit counterintuitive to some alpha leaders: cognitive diversity is more important than ability, disagreement is more important than consensus, voting policies and team size are more important than the coherence or comprehensibility of the decisions, and tuning the parameters of communication (frequency and fanout) is more important than charisma.
The same could be said for urban planning. How will cities be built and iterated upon? Who will make those decisions and how? We are just starting to see the shimmering refractions of the hive mind of human culture, and now we want to redesign the hives themselves to optimize the emergent complexity within. Perhaps the best we can do is set up the grand co-evolutionary dance and listen carefully for the sociobiology of supra-human sentience.
-----------
I first brainstormed about reinventing construction with Astro Teller and Sebastian Thrun when they were forming Google X and looking for the largest markets in the world that look ripe for disruption from advancing information technology and machine learning. The $10 trillion spent each year on buildings certainly qualified, and the global construction industry is growing from 13% of the entire global economy to 15% in 2020. Helix.re became the first Google X spinout, taking a data and software-driven approach to building design and optimization.
I have been playing around with the lightroom beta. This is a re-processing of a picture I originally posted 2 years ago.
I think in the original I over-cooked the white balance and exposure and prefer this more subtle treatment. I also adjusted the crop. Let me know which you think is better. Oh, you've got to view this on black...
Tic-tac-toe
From Wikipedia:
Tic-tac-toe, also called noughts and crosses, hugs and kisses, and many other names, is a pencil-and-paper game for two players, O and X, who take turns marking the spaces in a 3×3 grid, usually with X going first. The player who succeeds in placing three of their marks in a horizontal, vertical or diagonal row wins the game - as the first player, X, has done in the game pictured here.
Players soon discover that best play from both parties leads to a draw. Hence, tic-tac-toe is most often played by young children; when they have discovered an unbeatable strategy they move on to more sophisticated games such as dots and boxes. This reputation for ease has led to casinos offering gamblers the chance to play tic-tac-toe against trained chickens - though the chicken is advised by a computer program.
The simplicity of tic-tac-toe makes it ideal as a pedagogical tool for teaching the concepts of combinatorial game theory and the branch of artificial intelligence that deals with the searching of game trees. It is straightforward to write a computer program to play tic-tac-toe perfectly, to enumerate the 765 essentially different positions (the state space complexity), or the 26,830 possible games up to rotations and reflections (the game tree complexity) on this space.
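As a concrete illustration of how small this search problem is, here is a minimal sketch in Python of an ordinary minimax search (the function and names are mine for illustration; this is not the OXO or Tinkertoy implementation). It proves by exhaustive search that best play from both sides is a draw:

```python
# Minimal sketch: perfect tic-tac-toe play via exhaustive game-tree search.
# A board is a 9-character string, indices 0..8, row-major.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for i, j, k in LINES:
        if board[i] != " " and board[i] == board[j] == board[k]:
            return board[i]
    return None

def minimax(board, player):
    """Value of `board` with `player` to move: +1 X win, -1 O win, 0 draw."""
    w = winner(board)
    if w is not None:
        return 1 if w == "X" else -1
    if " " not in board:
        return 0  # board full: draw
    nxt = "O" if player == "X" else "X"
    scores = [minimax(board[:i] + player + board[i + 1:], nxt)
              for i, c in enumerate(board) if c == " "]
    return max(scores) if player == "X" else min(scores)

# Best play from both sides is a draw (value 0), as the article notes.
print(minimax(" " * 9, "X"))  # -> 0
```

The full tree has only a few hundred thousand nodes, so brute force suffices with no pruning at all; this is exactly what makes the game useful as a first exercise in game-tree search.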
The first known video game, OXO (or Noughts and Crosses, 1952) for the EDSAC computer, played perfect games of tic-tac-toe against a human opponent.
One example of a Tic-Tac-Toe playing computer is the Tinkertoy computer, developed by MIT students, and made out of Tinker Toys. It only plays Tic-Tac-Toe, and has never lost a game. It is currently on display at the Museum of Science, Boston.
Cyanotype, traditional iron salt party mix, combinatorially grappled in a head-shaped tub, brushed onto gelatin-sized vellum, subsequently exposed to Sol for an amount of time -- in the winter Texas air for ten minutes perhaps -- Finally, developed casually, while smoking, in water, vinegar, ammonia and tea-tannins.
Astro Teller, grandson of the hydrogen bomb and Moonshot maven, introducing me at Google. The video of my talk just went up.
---------------
ABSTRACT
Many of the interesting challenges in computer science, nanotechnology, and synthetic biology entail the construction of complex systems. As these systems transcend human comprehension, will we continue to design them or will we increasingly evolve them? As we design for evolvability, the locus of learning shifts from the artifacts themselves to the process that created them. There is no mathematical shortcut for the decomposition of a neural network or genetic program, no way to "reverse evolve" with the ease that we can reverse engineer the artifacts of purposeful design. The beauty of compounding iterative algorithms (evolution, fractals, organic growth, art) derives from their irreducibility.
Google itself is a complex system that seeks to perpetually innovate. Leadership in complex organizations shifts from direction setting to a wisdom of crowds. The role of upper management is to tune the parameters of communication. Leaders can embrace a process that promotes innovation with emergent predictability more than they can hope to dictate the product of innovation itself.
Innovation is critical to economic growth, progress, and the fate of the planet, yet it seems so random. While innovation may appear inscrutable at the atomic level, patterns emerge in the aggregate nonetheless. A critical pattern, spanning centuries, is that the pace of innovation is perpetually accelerating, and it is exogenous to the economy. Rather, it is the combinatorial explosion of possible innovation-pairings that creates economic growth.
---------------
(more on the dichotomy of design and evolutionary search, organizational optimization, and innovation)
I arranged the talk to overlap with an SFI brain spa @ Google. Some quotes from that event (without attribution, per the Chatham House Rule):
“Unlimited power limits intellectual parsimony.”
“With machine learning, we are creating electronic savants. They are happy in a high-dimensional space. They have no desire to reduce. What we want is electronic Keplers that can recognize the ellipse, not savants that can force fit a heliocentric model.”
“The target of evolution can’t be more complex than the selection pressure itself. If you can come up with the selection pressure, you might as well design it.”
“I don’t think there is any natural process that is incompressible. It’s not random.”
[I disagree with the premise of those last two quotes]
Which will come first? Green vs. Grey, as they say.
Thanks to Ariel Poler for hosting a SF Salon on the subject with Erik Torenberg of Village Global and Silicon Foundry.
I think we will build a superhuman AGI before we understand our own brain well enough to radically improve it or upload it to a silicon substrate. The complex creations of iterative algorithms (like evolution and deep learning) are inherently inscrutable. It is easier to push evolution forward than to reverse engineer the products of evolution.
We are in the middle of a sea change in how the vanguard of engineering will be done. Building complex systems that exceed human understanding is more like parenting than programming. The locus of learning shifts from end products to the process of their creation. An ever-growing percentage of software will be grown and an ever-growing percentage of compute will run on infrastructure that resembles the brain (massively parallel, fine grained architectures with in-memory compute and a growing focus on the memory and interconnect elements). This is the path to AGI, IMHO.
I’ve been working with a neural plasticity company for 14 years now (Posit Science). One of my concerns with uploading is the extreme plasticity of the sensory cortex and the recruitment of neighboring regions in the face of external changes (like phantom limb pain in amputees). Cut and paste of brain state to a foreign substrate may require a deep understanding of the analog domain, where structural topology and functional spike train variation is immense (there are over 300 types of neurons in the neocortex that are structurally and electrically distinct, and each neuron has ~200 ion channels drawn from a pool of 20-40 variations). Furthermore, our mostly 2D silicon substrates lack the interconnect density for a direct map of the synaptic fan-out of the brain. Without a deep understanding of which elements can be ignored or abstracted, a simulation of brain function explodes in combinatorial complexity.
Going back a decade, in talks about AI futures, I was fond of advising to “augment early and often.” I worry that people want to believe in extreme augmentation and uploading, not because it is likely, but because it offers a mental model for “humanity” maintaining the mantle of supremacy, perpetually perched at the pinnacle of evolution. The idea that evolution will eventually progress way beyond us is hard to internalize. We seek transcendence, as the antidote for obsolescence.
I’ll be brainstorming more about storming the brain this evening at a follow-up salon.
My 2006 musings on these topics.
Atheist myths debunked - Abiogenesis - the spontaneous generation of life from sterile matter.
Abiogenesis - the atheist and evolutionist belief that life can spontaneously generate itself from sterile matter whenever environmental conditions are conducive, and the belief that this actually happened on the early Earth.
Is it possible?
IMPOSSIBLE ACCORDING TO INFORMATION THEORY.
Three fundamentals are essential for the material universe to exist: matter - energy - information.
Obviously, all theories about how the universe operates, and its origins, must take account of all three. However, every evolutionary, origin of life hypothesis yet devised (primordial soup, hydrothermal vent, etc. etc.) concentrates on the chemistry/physics of life, i.e. the interaction of matter and energy.
Atheists and evolutionists have virtually ignored the essential role and origin of information. We should demand to know why, especially as we are told (through the popular media and education system) that an evolutionary, origin-of-life scenario should be regarded as irrefutable, scientific fact.
Atheists and evolutionists are well aware that the information required for life cannot just arise of its own accord in a primordial soup. So why do they usually omit this crucial fact from their origin of life story?
In order to store information, a storage code is required. Just as the alphabet and language is the code used to store information in the written word, life requires both the information itself, which controls the construction and operation of all living things, and the means of storing that information. DNA is the storage code for living things.
No evolutionary, origin of life hypothesis has ever explained either how the DNA storage system was formed, or how the information encoded within that DNA storage system originated. In fact, even to attempt to look for the origin of information in physical matter is to ignore the natural laws about information.
Information theory completely rules out the spontaneous generation of life from non-life.
Information theory tells us: ANY MODEL FOR THE ORIGIN OF LIFE BASED SOLELY ON PHYSICAL AND/OR CHEMICAL PROCESSES, IS INHERENTLY FALSE. And: THERE IS NO KNOWN LAW OF NATURE, NO KNOWN PROCESS AND NO KNOWN SEQUENCE OF EVENTS, WHICH CAN CAUSE INFORMATION TO ORIGINATE BY ITSELF IN MATTER… So information theory not only rules out all evolutionary hypotheses which cannot explain the origin of information in original life, it also rules out all evolutionary hypotheses which cannot explain the origin of the completely new, increasingly complex information which would be required to be added to a gene pool for progressive evolution to take place in existing life.
Because of their zealous and unshakable faith in Darwinian evolution, most evolutionists choose to ignore this. They simply refuse to face this most important question of all, where does the complex information essential for all life come from? The reason seems obvious, it is because there are only two answers which could be compatible with the evolution fable, both are unscientific nonsense which violate information theory. They are: 1. That information can just arise magically out of nowhere. OR 2. That the material universe is an intelligent entity, which can actually create information.
(See more on genetic information and the DNA code later on)
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE LAW OF BIOGENESIS.
The Law of Biogenesis rules out the spontaneous generation of life from non-living matter under all known circumstances. All modern scientists now accept this well tested law as valid. In fact, the whole concept of medical sterilisation, hygiene & food preservation is totally dependent on this law.
No sensible scientist would dare to claim that spontaneous generation of life ever happens in the world today, and there is no reason whatsoever to believe that this Law (like every natural law) is not always valid, in all places and at all times, within the material universe.
Yet, amazingly, in order to support biological evolution, evolutionists are quite prepared to flout this well-established Law and to resurrect the ancient belief in abiogenesis (life arising from non-life). Like latter-day advocates of the ancient Greek belief (that the goddess Gaia could make life arise spontaneously from stones), evolutionists and atheists routinely present to the public the preposterous notion that original life on earth (and even elsewhere in the universe) just spontaneously generated itself from inert matter. Apparently, all that was required to bypass this well-established Law was a chance accumulation of chemicals in some alchemist’s brew of ‘primordial soup’ combined with raw energy from the sun, lightning or geothermal forces. (Such is their faith in the creative powers of matter). They call this science? Incredible!
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE SECOND LAW OF THERMODYNAMICS.
The second Law of Thermodynamics rules out the spontaneous generation of life from non-life as a chance event. Even if we ignore the above reasons why spontaneous generation of life is impossible, the formation and arrangement by chance of all the components required for living cells is also impossible. The arrangement of all the components within the simplest of living cells is extremely precise; these components cannot just arrange themselves by chance.
According to the Second Law of Thermodynamics, when left to themselves, things naturally become more disordered, rather than more ordered. Or in other words, things will naturally go to more probable arrangements and disorder is overwhelmingly more probable than order. Disorder actually increases with the passage of time and also with the application of raw (undirected) energy (for example, heat).
Yet we are repeatedly told the evolution fable, that the numerous components required to form a first, self-replicating, living cell just assembled themselves in precise order, by pure chance, over a vast period of time, aided by the random application of raw, undirected energy.
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO THE LAW OF CAUSE AND EFFECT.
A fundamental principle of science is the law of cause and effect. It is a primary law of science, and the very basis of the scientific method.
The law of cause and effect tells us that an effect cannot be greater than its cause/s.
Life is not an intrinsic property of matter/energy - so it is beyond the capabilities of matter/energy to produce a property (life) it doesn't possess.
The interaction of matter and energy cannot produce an effect with properties extra and superior to its own properties, that would violate the law of cause and effect.
Can chemistry create biology - which has entirely different properties to its own?
Of course it can't.
Biology includes such properties as genetic information, the DNA code, consciousness and intelligence. To believe that chemistry can create biology - means believing that something inanimate can create additional, new properties that it doesn't possess. To exceed the limitations of its own properties would violate the law of cause and effect.
For matter/energy to be able to produce life whenever environmental conditions permit, it would have to be inherently predisposed to produce life.
It would have to embody an inherent plan/blueprint/instructions for life, as one of its properties. The inevitable question then has to be - where does an inherent predisposition for life come from? It can only signify the existence of purpose in the universe and that is something atheists could never accept.
A purpose, order or plan can only come from a planner or intelligent entity. So it is a catch-22 situation for atheists ... the atheist/evolutionist belief in abiogenesis either violates the law of cause and effect, OR is an admission of purpose in the universe. It can only be one or the other. Atheists cannot possibly accept the existence of purpose in the universe, because that would be the end of atheism. So the atheist belief in abiogenesis violates the law of cause and effect.
Verdict of science - abiogenesis is not possible.
IMPOSSIBLE ACCORDING TO MATHEMATICS.
Even if we ignore the Law of Biogenesis, Information Theory and the Second Law of Thermodynamics (which all completely rule out the spontaneous generation of a living cell from non-living matter), mathematical probability also rules out the spontaneous generation of life from non-living matter.
The laws of probability are summed up in the Law of Chance. According to this Law, when the odds against a chance event are 10 to the power of 15, the chance of that event happening is negligible on a terrestrial scale. At odds of 10 to the power of 50, there is virtually no chance, even on a cosmic scale. The most generous and favourable mathematical odds against a single living cell appearing in this way by chance are a staggering 10 to the power of 40,000. A more likely calculation would put the odds at an even more awesome 10 to the power of 119,850. Remember, odds of 10 to the power of 50 are sufficient to make an event virtually impossible (except, perhaps, by magic!!).
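For scale, here is a short sketch in Python that simply restates the exponents quoted above (the figures are the passage's own; the variable names are mine), comparing them on a log-10 basis:

```python
# Comparing the exponents quoted in the passage above (log scale, base 10).
threshold_cosmic = 50        # quoted as "virtually no chance, even on a cosmic scale"
quoted_generous = 40_000     # quoted "most generous" odds exponent
quoted_likely = 119_850     # quoted "more likely" odds exponent

print(quoted_generous / threshold_cosmic)  # 800.0  -> 800x the quoted threshold
print(quoted_likely / threshold_cosmic)    # 2397.0 -> 2397x the quoted threshold
```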
Verdict of science - abiogenesis is not possible.
Fred Hoyle, The Big Bang in Astronomy, New Scientist 19 Nov 1981. p.526. On the origin of life in primeval soup.
“I don’t know how long it is going to be before astronomers generally recognise that the combinatorial arrangement of not even one among the many thousands of biopolymers on which life depends could have been arrived at by natural processes here on the Earth. Astronomers will have a little difficulty at understanding this because they will be assured by biologists that it is not so. The biologists having been assured in their turn by others that it is not so. The “others” are a group of persons who believe, quite openly, in mathematical miracles. They advocate the belief that tucked away in nature, outside of normal physics, there is a law which performs miracles.”
“Since science does not have the faintest idea how life on earth originated, it would only be honest to confess this to other scientists, to grantors, and to the public at large. Prominent scientists speaking ex cathedra, should refrain from polarising the minds of students and young productive scientists with statements that are based solely on beliefs.” Bio-informaticist, Hubert P. Yockey. Journal of Theoretical Biology [Vol 91, 1981, p 13].
Conclusion: Abiogenesis is impossible - it is just another atheist myth debunked by science.
Evolutionists and atheists are quite entitled to abandon the scientific method and all common sense by choosing to believe that all the necessary information for life can just appear in matter, as if by magic. They can also choose to believe that the Laws of Biogenesis, Mathematical Probability, Cause and Effect, and the Second Law of Thermodynamics were all somehow magically suspended to enable their purported evolution of life from sterile matter to take place. They can believe whatever they like. But they have no right to present such unscientific flights of fancy through the media and our education system, as though they are supported by science.
More about DNA and the origin of life.
The discovery of DNA should have been the death knell for evolution. It is only because atheists and evolutionists tend to manipulate and interpret evidence to suit their own preconceptions that makes them believe DNA is evidence FOR evolution.
It is clear that there is no natural mechanism which can produce constructional, biological information, such as that encoded in DNA.
Information Theory (and common sense) tells us that the unguided interaction of matter and energy cannot produce constructive information.
Do atheists/evolutionists even know where the very first, genetic information in the alleged Primordial Soup came from?
Of course they don't, but with the usual bravado, they bluff it out, and regardless, they rashly present the spontaneous generation of life as a scientific fact.
However, a fact it certainly isn't... and good science it certainly isn't.
Even though atheists/evolutionists have no idea whatsoever about how the first, genetic information originated, they still claim that the spontaneous generation of life (abiogenesis) is an established scientific fact, but this is completely disingenuous. Apart from the fact that abiogenesis violates the Law of Biogenesis, the Law of Cause and Effect and the Second Law of Thermodynamics, it also violates Information Theory.
Evolutionists/atheists have an enormous problem with explaining how the DNA code itself originated. However that is not even the major problem. The impression is given to the public by evolutionists that they only have to find an explanation for the origin of DNA by natural processes - and the problem of the origin of genetic information will have been solved.
That is a confusion in the minds of many people that evolutionists/atheists cynically exploit.
Explaining how DNA was formed by chemical processes, explains only how the information storage medium was formed, it tells us nothing about the origin of the information it carries.
To clarify this, it helps to compare DNA to other information storage mediums.
For example, if we compare DNA to the written word, we understand that the alphabet is a tangible medium for storing, recording and expressing information, it is not information in itself. The information is recorded in the sequence of letters, forming meaningful words.
You could say that the alphabet is the 'hardware' created from paper and ink, and the sequential arrangement of the letters is the software. The software is a mental construct, not a physical one.
The same applies to DNA. DNA is not information of itself; just like the alphabet, it is the medium for storing and expressing information. It is an amazingly efficient storage medium. However, it is the sequence or arrangement of the nucleotide bases which is the actual information, not the DNA medium itself.
So, if evolutionists are ever able to explain how DNA was formed by chemical processes, it would explain only how the information storage medium was formed. It will tell us nothing about the origin of the information it carries.
Thus, when atheists and evolutionists tell us it is only a matter of time before 'science' will be able to fill the 'gaps' in our knowledge and explain the origin of genetic information, they are not being honest. Explaining the origin of the 'hardware' by natural processes is an entirely different matter to explaining the origin of the software.
Next time you hear evolutionists/atheists skating over the problem of the origin of genetic information with their usual bluff and bluster, and parroting their usual nonsense about science being able to fill such gaps in knowledge in the future, don't be fooled. They cannot explain the origin of genetic information, and never will be able to. The software cannot be created by chemical processes or the interaction of energy and matter; it is not possible. If you don't believe that, then by all means put it to the test by challenging any evolutionist to explain how genetic information (not DNA) can originate by natural means. I can guarantee they won't be able to do so.
Atheists often argue that, because the Earth is an open system, energy from the Sun can overcome the problem of entropy and enable the increase in complexity that the origin of life requires, but that is clearly erroneous.
We can see entropy happening here and now, it happens everyday on Earth.
We are living in the OPEN system of the Earth, and yet we are well aware of entropy.
We see that the Sun does not halt or reverse entropy, in fact we see the opposite.
The raw energy and heat from the Sun, unless harnessed, does damage. Things all around us obey the law: they deteriorate, rot, erode and decay; they do not naturally improve.
If you paint your house, the Sun, and the weather effects caused by the Sun, will eventually damage the paintwork, it will crack and peel after a few years. The hotter the Sun (the greater the energy input) the quicker it will happen.
Secondly, even if it were true that in an open system things can defy the law of entropy, natural laws are laws for the whole universe, and the universe, as a whole, is a closed system.
So what can we deduce from this?
Can the effects of entropy ever be reversed or halted? Obviously when you paint your house, you are reversing the bad effects of entropy for a short period, but you have to keep doing it; it is not permanent. Moreover, the energy you are using to repair and temporarily reverse the effects of entropy is directed and guided by your skill and intelligence.
The atheist argument about the Earth being an open system is clearly not a valid one.
There are only 2 ways the effects of entropy can be temporarily decreased, halted or reversed by an input of energy. That is:
1. A directive means guiding the energy input.
OR,
2. A directive or conversion mechanism possessed by the recipient of the energy to utilise it in a constructive way.
For their argument to be valid, atheists would have to explain what it is that guides or directs the energy from the Sun to enable it to perform the task of creating order from disorder in the so-called primordial soup. And they are unable to do so.
Evolutionism: The Religion That Offers Nothing.
www.youtube.com/watch?v=znXF0S6D_Ts&list=TLqiH-mJoVPB...
FOUNDATIONS OF SCIENCE
The Law of Cause and Effect. Dominant Principle of Classical Physics. David L. Bergman and Glen C. Collins
www.thewarfareismental.net/b/wp-content/uploads/2011/02/b...
"The Big Bang's Failed Predictions and Failures to Predict: (Updated Aug 3, 2017.) As documented below, trust in the big bang's predictive ability has been misplaced when compared to the actual astronomical observations that were made, in large part, in hopes of affirming the theory."
Opening on Amazon:
All people can create value—but for that to happen, we need to develop a people-centered, rather than a task-centered, economy. Today, we are very far from that. According to Gallup, of the five billion people on this planet aged fifteen or older, three billion work in some way. Most of them want full-time jobs, but only 1.3 billion have them. Of these, only 13 percent are fully engaged in their work, giving and receiving its full value. This terrible waste of human capacity and mismanagement of people’s desire to create value for each other is more than just very bad business. It is an insult to ourselves and to all human beings.
CHAPTER 5. Accelerating Towards a Jobless Future:
The Rise of the Machine and the Human Quest for Meaningful Work by Steve Jurvetson and Mo Islam
A New Paradigm
Let’s go far enough in the future where no one will debate the sweeping transition of time. There are infinite possible paths to this distant future, but we can imagine reasonable endpoints. This future will look like much of human history prior to the industrial and agricultural revolutions, where serfs and slaves did most of the labor-intensive work in the city-state economies. But while we hope the arc of the moral universe continues to bend towards justice, there will be a new paradigm in the master-and-slave relationship between man and machine. The slaves of the future will be our machines.
There won’t be many jobs in the sense that we think of them for most people today. Machines will take over mechanically repetitive tasks. Humans will only ever need to do this type of work if they choose to, though they will not be the most efficient means of completing these tasks. Even highly skilled workers, such as engineers, doctors, and scientists, will have their professions disrupted by automation and artificial intelligence. We will automate engineering, we will automate diagnosis, and we will automate discovery of scientific principles. In this future, where the marginal cost of labor is zero and where companies have reached new bounds of profit maximization, both the microeconomics of individual companies and the macroeconomics of the global economy will be completely upended. Maslow’s hierarchy of needs—food, shelter, health care, education—will be free for everyone forever. We won’t need to work to achieve the basic building blocks of sustainable civilization. The only important human need that will be amplified in this distant future even more than it is now is the desire for meaning.
Humanity’s Compounding Capacity to Compute
First, we will lay a framework for understanding why we believe this is a possible future. We are already on the trajectory to get us there—we have been since the dawn of the industrial age. Humanity’s capacity to compute has been constantly compounding. Incredibly, it can be explained through a simple and elegant model that, at first glance, may seem narrow in its explanatory power, but that tells a much deeper story. That model to describe this macrotrend begins with Moore’s Law. Moore’s Law is commonly reported as a doubling of transistor density every eighteen months. But unless you work for a chip company and focus on fab-yield optimization, you do not care about the transistor counts that Gordon Moore originally wrote about. When recast as a computational capability, Moore’s Law is no longer a transistor-centric metric.
What Moore observed in the belly of the early integrated circuit industry was a derivative metric, a refraction of a longer-term trend, a trend that begs various philosophical questions and predicts mind-bending futures. Ray Kurzweil’s abstraction of Moore’s Law shows computational power on a logarithmic scale and finds a double exponential curve that holds over 110 years! A straight line would represent a geometrically compounding curve of progress.
Figure 1: Ray Kurzweil’s abstraction of Moore’s Law. Each dot is a computer. (older version)
Through five paradigm shifts—such as electromechanical calculators and vacuum tube computers—the computational power that $1,000 buys has doubled every two years. For the past thirty years, it has been doubling every year.
Each dot is the frontier of computational price performance of the day. One machine was used in the 1890 census; one cracked the Nazi Enigma cipher in World War II; one predicted Eisenhower’s win in the 1956 presidential election. Many of them can be seen in the Computer History Museum. Each dot represents a human drama. Prior to Moore’s seminal paper in 1965, which presented what later became known as Moore’s Law, none of them even knew they were on a predictive curve. Each dot represents an attempt to build the best computer with the tools of the day. Of course, we use these computers to make better design software and manufacturing control algorithms. And so the progress continues.
Notice also that the pace of innovation is exogenous to the economy. The Great Depression and the world wars and various recessions do not introduce a meaningful change in the long-term trajectory of Moore’s Law. Certainly, the adoption rates, revenues, profits, and economic fates of the computer companies behind the various dots on the graph may go through wild oscillations, but the long-term trend emerges nevertheless.
In the modern era of accelerating change in the tech industry, it is hard to find even five-year trends with any predictive value, let alone trends that span the centuries. We would go further and assert that this is the most important graph ever conceived, which is why it serves as the foundation for understanding the future. We humans, regardless of external factors such as war, disease, and failing economies, have over vast periods of time doubled our capabilities to produce new technologies to propel us forward.
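To make the compounding concrete, here is a minimal back-of-the-envelope sketch in Python (illustrative only; growth_factor is a hypothetical helper, not from the essay) of how the doubling cadences described above accumulate:

```python
# Minimal sketch: the multiple by which computational price-performance
# grows under a fixed doubling cadence.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Price-performance multiple after `years` of steady doubling."""
    return 2.0 ** (years / doubling_period_years)

# Doubling every two years across a century of paradigm shifts:
print(f"{growth_factor(100, 2):.2e}x")  # ~1.13e+15

# Doubling every year for the past thirty years:
print(f"{growth_factor(30, 1):.2e}x")   # ~1.07e+09
```

Even modest changes in the doubling period compound into orders-of-magnitude differences over decades, which is why the cadence itself, rather than any one machine, is the story.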
Accelerating Technological Progress
Moore’s Law has set the bar for the accelerating pace of computation and innovation. How can we expect the pace to keep accelerating, getting even faster, between now and the distant future we describe? All new technologies are combinations of technologies that already exist. Innovation does not occur in a vacuum; it is a combination of ideas from before. In any academic field, the advances today are built on a large edifice of history. This is why major innovations tend to be “ripe” and tend to be discovered at nearly the same time by multiple people. The compounding of ideas is the foundation of progress, something that was not so evident to the casual observer before the age of science. Science tuned the process parameters for innovation and became the best method for a culture to learn.
From this conceptual base comes the origin of economic growth and acceleration of technological change, as the combinatorial explosion of possible idea pairings grows exponentially as new ideas come into the mix, as dictated by Reed’s Law. It explains the innovative power of urbanization and networked globalization. And it explains why interdisciplinary ideas are so powerfully disruptive; it is like the differential immunity of epidemiology, whereby islands of cognitive isolation (e.g., academic disciplines) are vulnerable to disruptive memes hopping across them, in much the same way that South America was vulnerable to smallpox from Cortés and the Conquistadors. If disruption is what you seek, cognitive island hopping is a good place to start, mining the interstices between academic disciplines.
It is the combinatorial explosion of possible innovation-pairings that creates economic growth, and it is about to go into overdrive. In recent years, we have begun to see the global innovation effects of a new factor: the Internet. People can exchange ideas as never before. Long ago, people were not communicating across continents; ideas were partitioned, and so the success of nations and regions pivoted on their own innovations. Richard Dawkins states that in biology it is genes which really matter, and we as people are just vessels for the conveyance of genes. It is the same with ideas or “memes.” We are the vessels that hold and communicate ideas, and now that pool of ideas percolates on a global basis more rapidly than ever before.
Rise of the Machines
Moore’s Law provides the model for us to understand humanity’s continuous compounding capacity to compute. With that comes accelerating technological progress, driven by the combinatorial explosion of new ideas among ever-increasing sub-groups of cognitively diverse people becoming connected. However, the ramifications of this longer-term trend will start to become apparent in the very short term. We believe the greatest disruptor for job displacement caused by this accelerating innovation is the self-driving car.
In five years, it will be clear that the debate about the rise of the autonomous vehicle will have ended. Everyone will realize its ubiquity, especially as the first city pilots with autonomous vehicles begin rolling out. The Google car has already driven over a million miles without causing an accident. Automotive original equipment manufacturers and new companies are investing massive amounts of capital and engineering manpower to get to market with fully (Level 4) autonomous cars. The commercialization path of these self-driving cars, whether through an Uber-like on-demand service or through direct sales to consumers, is less important than the enormous impact they will have on the global job market. Using global employment data from the International Labour Organization (ILO), we find that by 2019, 5.7 percent of global employment will be in the transport, storage, and communication sector (See Figure 2). Moreover, the distribution of employment status data shows us that globally more than 60 percent of all workers lack any kind of employment contract, with most of them engaged in unpaid or family work in the developing world (See Figure 3). We find that, of workers worldwide who have a paid full-time job (excluding temporary workers), almost 20 percent drive as their form of employment today!
And autonomous vehicles are only the tip of the iceberg. As these systems transcend human comprehension, we will shift from traditional engineering to evolutionary algorithms and iterative learning algorithms such as deep learning and machine learning. While these techniques are powerful, the locus of learning shifts from the artifacts themselves to the process that created them. The beauty of compounding iterative algorithms (evolution, fractals, organic growth, art) derives from their irreducibility. And it empowers us to design complex systems that exceed human understanding, which we increasingly need to do at the cutting edge of software engineering. This process presents a plausible path to general artificial intelligence, or what Ray Kurzweil and others refer to as “strong A.I.” Danny Hillis summarizes succinctly in the conclusion from his programming primer The Pattern on the Stone: “We will not engineer an artificial intelligence; rather we will set up the right conditions under which an intelligence can emerge. The greatest achievement of our technology may well be creation of tools that allow us to go beyond engineering—that allow us to create more than we can understand.” Once we build these systems that surpass human understanding and that may even surpass human intelligence, the number of jobs that will be overhauled is unbounded—leading us to a future where no one will have to work.
Figure 2: Employment growth by sector, in which transport is one of the fastest growing.
Figure 3: Distribution of employment status, showing that only 40 percent of people have full-time jobs.
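To ground the shift from traditional engineering to iterative algorithms described above, here is a minimal toy sketch in Python (an illustration of the general technique only; the bit-string fitness function and all parameter values are my own choices, not any system referenced in this chapter). The point it makes is the chapter's: the learning lives in the mutate-and-select process, not in the evolved artifact.

```python
# Toy evolutionary loop: evolve a 64-bit genome toward all ones.
import random

GENOME_LEN = 64

def fitness(genome):
    """Count of 1-bits; the 'selection pressure' is fully explicit."""
    return sum(genome)

def mutate(genome, rate=0.02):
    """Flip each bit independently with probability `rate`."""
    return [bit ^ (random.random() < rate) for bit in genome]

random.seed(0)  # seeded for reproducibility
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(50)]
for generation in range(500):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    survivors = population[:10]  # truncation selection
    population = [mutate(random.choice(survivors)) for _ in range(50)]

print(generation, max(fitness(g) for g in population))
```

The evolved bit string is trivially simple, yet the asymmetry already shows: the selection pressure and mutation rate are inspectable, while nothing in the final artifact records how it came to be.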
Meaningful Work
Moore’s Law will drive human innovation forward and the collective global intelligence will create new forms of super artificial intelligence that can surpass human capabilities. This will completely disrupt our notion of jobs. Work is now the very thing that powers our global economy. But what happens when it no longer has to? Or at least, when most humans are no longer the aggregate primary drivers of global work, how will we find meaning in our lives? This existential phenomenon is one that will completely turn the current debate about the race against the machine on its head: the debate will no longer be about machines taking human jobs but instead about humans needing meaning in their work, even though it may no longer be for employment. The nature of jobs as we think about them today will dramatically change in the future, but humans will retain their thirst for deriving purpose from their actions. This is already becoming a major focus for employers now, as millennials entering the job market are interested in more than just salary, benefits, and job security to satisfy their work expectations. They want to be a part of something larger, to fulfill a mission that can really change the world. As we look to this distant future where employment isn’t necessary for most humans, finding meaning through non-traditional forms of work, whether hobbies, research, or entertainment, will become paramount to sustaining a thriving civilization.
An elderly couple, apparently on a pilgrimage, decides to make camp in order to wash their gabbeh. It is at this moment that the figure who will become the story's delegated narrator appears: Gabbeh, who tells the couple of the ordeals she had to endure to win the love of her beloved horseman, since her father opposed the marriage. With an epic structure, in the Brechtian manner, the discourse is displaced back in time; the film shows the various episodes that, we suppose, the old couple had to live through in order to arrive at the present in which the film opens. Through straight cuts, giving way to a voice-over, we enter the life of this community from the protagonist's point of view. The portrait, which allows the narrator to raise the image to the status of documentary (we see how the men shear the sheep, how the women dye the wool, how the carpets are woven across the various episodes), also allows him to articulate, through the images, a story as such. The film tries to situate itself at the moment from which those scattered impressions are actualized into a structure that supports them and can account for an existence; the moment when everything that had seemed inconsistent takes form, in this specific case the instant in which this couple recognizes itself, but which can be likened to the moment in which a community inscribes itself as such. To account for this (and here the film would enter into opposition with certain narrative models and the theories that describe them), the emphasis is placed not on orality but on the images. And not only on images as the cinematic apparatus's capacity to reproduce them, but on those images that have been relegated to a residual place, such as, for example, the motifs of the gabbehs. Appealing to the cinematic, this return raises the question of what cinema, in its preoccupation with technological development and its eagerness to become an interchangeable object in the logic of consumption, is leaving behind: the possibility of working with blocks of movement-duration. Far from melancholy, and close to the paradox of Achilles and the tortoise, the emphasis falls on the resources given in the apparatus as such: the possibilities of framing, of the size and duration of the shots, of lights and shadows and, fundamentally, of colors. If in The Silence (1998) the concern was with the expressive possibilities of sound, and thus of the ear as a sensory faculty, here the emphasis is on the visual. Through an austere mise-en-scène, whose locations are natural spaces and whose actors are members of the tribe itself, the film reinforces the idea that a cinema that departs from the storytelling of the big-budget productions is still possible. Possible insofar as neither million-dollar budgets nor the latest camera model are needed to tell a story. In this regard, it is enough to recall the sequence in which Gabbeh's uncle teaches the children of the tribe the primary colors and their combinatorial possibilities: blue like the sky's, yellow like the sunflowers'. Blue with yellow makes green, like that of the pastures. As in a magic show, the teacher grabs a piece of sky, of pasture, of flower, and from off-screen brings the color to his pupils: us. The invitation, more than to didacticism, is to surprise and to homage.
To Meliès, for example, and the allusion to the origins of cinema would not end there. The images fascinate, especially in the long shots of landscapes, together with the phrases that float over the text: "life is color", "man is color", "love is color"... and death is also a color, but black. And this fascination does not exclude the narrator. As in Antonioni's The Passenger (1975), the camera settles on landscapes that, as a point of view, cannot be attributed to any character in the story, and remains there, attentive. In the words of the film's cinematographer: "It was that nature that invited us to place the camera in this or that framing, no matter where!" To conclude, as we have been insisting with respect to this Second New Iranian Cinema, every new film enriches the world. It expands it insofar as it shows that, in the era of big-budget productions and of apocalyptic predictions about the future of cinema, there are still images to explore. As an old Iranian proverb, floating over the film, says: "Seeing is not the same as looking."
Each of us submitted an essay on innovation and growth in advance for the Gruter Institute Conference on Growth. I’ll append mine below.
(photo by John Chisholm. More below).
Discussion ensued over lunch, and one of my favorite authors, Matt Ridley wrote a summary for the WSJ “Why Can't Things Get Better Faster (or Slower)?”
------------------------------------
Innovation and Growth — Evolving Cities and Culture
By Steve Jurvetson
Innovation is critical to economic growth, progress, and the fate of the planet. Yet, it seems so random. But patterns emerge in the aggregate, and planners and politicians may be able to promote innovation and growth despite the overall inscrutability of this complex system. To tap the wisdom of crowds, we should shift the locus of learning from products to process. Leadership is not spotting the next growth industry, but tuning the parameters of human communication.
One emergent pattern, spanning centuries, is that the pace of innovation is perpetually accelerating, and it is exogenous to the economy. Rather, it is the combinatorial explosion of possible innovation-pairings that creates economic growth. And that is why cities are the crucible of innovation.
Geoffrey West of the Santa Fe Institute argues that cities are an autocatalytic attractor and amplifier of innovation. People are more innovative and productive, on average, when they live in a city because ideas can cross-pollinate more easily. Proximity promotes propinquity and the promiscuity of what Matt Ridley calls “ideas having sex”. This positive network effect drives another positive feedback loop - by attracting the best and the brightest to flock to the salon of mind, the memeplex of modernity.
Cities are a structural manifestation of the long arc of evolutionary indirection, whereby the vector of improvement has risen steadily up the ladder of abstractions from chemicals to genes to systems to networks. At each step, the pace of progress has leapt forward, making the prior vectors seem glacial in comparison – rather, we now see the nature of DNA and even a neuron as a static variable in modern times. Now, it’s all about the ideas - the culture and the networks of humanity. We have moved from genetic to memetic evolution, and much like the long-spanning neuron (which took us beyond nearest-neighbor and broadcast signaling among cells) ushered in the Cambrian explosion of differentiated and enormous body plans, the Internet brings long-spanning links between humans, engendering an explosion in idea space, straddling isolated pools of thought.
And it’s just beginning. In the next 10 years, three billion minds will come online for the first time to join this global conversation (Diamandis).
But why does this drive innovation and accelerating change? Start with Brian Arthur’s observation that all new technologies are combinations of technologies that already exist. Innovation does not occur in a vacuum; it is a combination of ideas from before. In any academic field, the advances today are built on a large edifice of history. This is the foundation of progress, something that was not so evident to the casual observer before the age of science. Science tuned the process parameters for innovation, and became the best method for a culture to learn.
From this conceptual base comes the origin of economic growth and accelerating technological change, as the combinatorial explosion of possible idea pairings grows exponentially as new ideas come into the mix (on the order of 2^n possible groupings per Reed’s Law). It explains the innovative power of urbanization and networked globalization. And it explains why interdisciplinary ideas are so powerfully disruptive; it is like the differential immunity of epidemiology, whereby islands of cognitive isolation (e.g., academic disciplines) are vulnerable to disruptive memes hopping across, much like South America was to smallpox from Cortés and the Conquistadors. If disruption is what you seek, cognitive island-hopping is a good place to start, mining the interstices between academic disciplines.
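A quick numerical sketch in Python makes the contrast concrete (the 2^n figure is the essay's; the helper names are mine for illustration): pairwise connections grow polynomially, while Reed's Law group-forming combinations grow exponentially.

```python
# Counting pairwise idea pairings vs. Reed's Law group-forming combinations.
from math import comb

def pairings(n):
    """Pairwise combinations of n ideas: n choose 2."""
    return comb(n, 2)

def groupings(n):
    """Reed's Law: subgroups of two or more ideas, 2^n - n - 1."""
    return 2 ** n - n - 1

for n in (10, 20, 40):
    print(f"{n:>3} ideas: {pairings(n):>4} pairings, {groupings(n):,} groupings")
#  10 ideas:   45 pairings, 1,013 groupings
#  20 ideas:  190 pairings, 1,048,555 groupings
#  40 ideas:  780 pairings, 1,099,511,627,735 groupings
```

That exponential gap between pairings and groupings is the "combinatorial explosion" the essay leans on.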
So what evidence do we have of accelerating technological change? At DFJ, we see it in the diversity and quality of the entrepreneurial ideas arriving each year across our global offices. Scientists do not slow their thinking during recessions. For a good mental model of the pace of innovation, consider Moore’s Law in the abstract – the annual doubling of compute power or data storage. As Ray Kurzweil has plotted, the smooth pace of exponential progress spans from 1890 to 2012, across countless innovations, technology substrates, and human dramas — with most contributors completely unaware that they were fitting to a curve.
Moore’s Law is a primary driver of disruptive innovation – such as the iPod usurping the Sony Walkman franchise – and it drives not only IT and communications, but also now genomics, medical imaging and the life sciences in general. As Moore’s Law crosses critical thresholds, a formerly lab science of trial and error experimentation becomes a simulation science and the pace of progress accelerates dramatically, creating opportunities for new entrants in new industries. And so the industries impacted by the latest wave of tech entrepreneurs are more diverse, and an order of magnitude larger — from automobiles and rockets to energy and chemicals.
At the cutting edge of computational capture is biology; we are actively reengineering the information systems of biology and creating synthetic microbes whose DNA was manufactured from bare computer code and an organic chemistry printer. But what to build? So far, we largely copy large tracts of code from nature. But the question spans across all the complex systems that we might wish to build, from cities to designer microbes, to computer intelligence.
As these systems transcend human comprehension, will we continue to design them or will we increasingly evolve them? As we design for evolvability, the locus of learning shifts from the artifacts themselves to the process that created them. There is no mathematical shortcut for the decomposition of a neural network or genetic program, no way to "reverse evolve" with the ease that we can reverse engineer the artifacts of purposeful design. The beauty of compounding iterative algorithms (evolution, fractals, organic growth, art) derives from their irreducibility. (My Google Tech Talk goes into some detail on the dichotomy of design and evolution).
The corporation is a complex system that seeks to perpetually innovate. Leadership in these complex organizations shifts from direction setting to a wisdom of crowds. And the process learning is a bit counterintuitive to some alpha leaders: cognitive diversity is more important than ability, disagreement is more important than consensus, voting policies and team size are more important than the coherence or comprehensibility of the decisions, and tuning the parameters of communication (frequency and fanout) is more important than charisma.
The same could be said for urban planning. How will cities be built and iterated upon? Who will make those decisions and how? We are just starting to see the shimmering refractions of the hive mind of human culture, and now we want to redesign the hives themselves to optimize the emergent complexity within. Perhaps the best we can do is set up the grand co-evolutionary dance, and listen carefully for the sociobiology of supra-human sentience.
I am at the TTI/Vanguard Next conference (agenda), with a sophisticated audience of tech executives from around the world. Of the topics I covered, the Q&A interest focused on iterative algorithms that will create an AI that exceeds human intelligence, much like biological evolution. (video)
Here are some of the related bullets from my slides:
Reed's Law applies to combinations of ideas as well as self-forming groups. It's the combinatorial explosion in the mating pool of ideas that creates perpetually accelerating progress.
Evolutionary algorithms allow us to build complex systems that exceed human understanding (synthetic biology, AI, innovative organizations), but there are some limitations to this approach:
• Subsystem Inscrutability
- Black box defined by its interfaces
- No “reverse evolution” (You can't run that algorithm backwards)
• No simple shortcuts across the iterations
- Simulation ~ Reality
- Beauty from irreducibility
• Locus of Learning is Process, not Product
• Robust, within co-evolutionary islands
“The greatest achievement of our technology may well be the creation of tools that allow us to go beyond engineering – that allow us to create more than we can understand.” — Danny Hillis
“We actually think quantum machine learning may provide the most creative problem-solving process under the known laws of physics.” — Google Blog
AI implications:
• Cut & Paste Portability?
• Locus of learning: Process, not Product
- Would we bother to reverse engineer?
- No hard take off?
• Co-evolutionary islands
- accustomed environment (differential immunity)
• Path dependence
- algorithm survival
- AI = Alien Intelligence defined by sensory I/O
Accelerating Technological Change
- Interdisciplinary Renaissance
- IT innervates $T markets
- More Black Swans
- Perpetual driver of disruption
==> Virtuous cycle for entrepreneurs
==> a great time for the new
Comments from others that followed:
“The majority of financial reports are now compiled by machines, not people.”
“A lot of the great data scientists are born in Russia, and they have the attributes of creativity, tenacity and an ability to code.”
“When we asked 1000 people on Mechanical Turk to flip a coin, we got 65% heads, 28% tails, and 7% typos. Many of them clearly did not actually flip a coin.”
“Imagine the sociological impact of crowdsourcing – what if you could create IBM for an afternoon and then disperse it? We might get cyber-Taylorism if we don’t think about doing it right.”
“Competition will be critical to the wisdom of crowds.”
Combinatorial Creativity: “Combinatorial search spaces are vast and the fastest supercomputers can not penetrate too deeply into them. Nevertheless, they may be able to penetrate several levels deeper than any person can, and thereby find superb creative acts that mankind did not or could not think of.”
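A back-of-the-envelope sketch of that point (my numbers, not the speaker's): because search trees grow exponentially, raw speed buys depth only logarithmically, which is exactly why machines penetrate "several levels deeper" rather than orders of magnitude deeper.

```python
# Illustration (assumed figures): in a search tree with branching factor b,
# a machine that evaluates k times more nodes only reaches log(k)/log(b)
# extra plies -- speed buys depth logarithmically.
import math

b = 30                      # assumed branching factor
human_nodes = 1e6           # nodes a person might effectively consider
super_nodes = 1e17          # nodes a supercomputer might search

def depth(nodes: float, branching: int) -> float:
    """Depth reachable by exhaustively expanding `nodes` tree nodes."""
    return math.log(nodes) / math.log(branching)

print(f"human depth:   {depth(human_nodes, b):.1f} plies")
print(f"machine depth: {depth(super_nodes, b):.1f} plies")
print(f"extra depth:   {depth(super_nodes, b) - depth(human_nodes, b):.1f} plies")
```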
Pointer to CHM video on the history of AI.
Photos by Ed Jay
The current Forbes cover reminds me of the longevity confab that Joon Yun convened yesterday. I pulled slides together just prior to going on stage, and in retrospect, I might have titled the talk as I do here in the photo caption. Here’s a FB video from the audience, starting one minute in. Thanks to Asa for the photo.
Here are some notes from longevitycrossroads.org, before the fire marshal kicked us out.
Joon: “any event that becomes a fire hazard is probably worth doing. I have to imagine that that was on the mind of the Burning Man founders.”
Elizabeth Blackburn, President, Salk Institute & Nobel Prize winner for discovering the molecular nature of telomeres:
Yeast cells divide only 25 times and then stop. Why? What happens? Catastrophic systems failure.
Telomere tips protect the cell's DNA. When they run down, the code will no longer replenish the cell. Such cells become like little rotten apples, spitting out inflammatory chemicals. If you clear these undead cells out, mice stay healthy. This is behind senescence.
Our germ line cells know how to generate a fresh new baby with extended telomeres. So, there is hope.
The longest-living human kept smoking well past age 110. She stopped only when she could no longer see well enough to light a cigarette. We don’t know how long we could live.
Dr. Eric Verdin, CEO of the Buck Institute:
How aging research will disrupt medicine
C. elegans: a DAF-2 gene modification doubles lifespan (from a 21-day base); modifying the same DAF-2 gene plus removal of the gonad yields a 6x lifespan. For those squirming in their seats, we’re not thinking about removing gonads in humans. Caloric restriction in mice produces dramatic lifespan extension.
Okinawa, Japan: People there have the longest life expectancy and the most centenarians. Local saying: “Eat only until 80% full.” Self-imposed caloric restriction.
Life expectancy drops from 82 to 65 years when Okinawans live in Brazil.
TOR and insulin signaling are related to caloric restriction.
Today: focused on diseases and organs.
Future: preventative, reparative, centered on aging pathways, multi-organ.
Kicking off the D-Wave Board meeting over lunch today at Goldman Sachs… with news from Google that they demonstrated the use of D-Wave’s quantum computer to deliver photo-driven search (and improve on classical machine learning).
Here is a summary from the Google Research blog:
“Many Google services we offer depend on sophisticated artificial intelligence technologies such as machine learning or pattern recognition. If one takes a closer look at such capabilities one realizes that they often require the solution of what mathematicians call hard combinatorial optimization problems. It turns out that solving the hardest of such problems requires server farms so large that they can never be built. A new type of machine, a so-called quantum computer, can help here.
Today, at the Neural Information Processing Systems conference (NIPS 2009), we show the progress we have made. We demonstrate a detector that has learned to spot cars by looking at example pictures. It was trained with adiabatic quantum optimization using a D-Wave C4 Chimera chip. There are still many open questions but in our experiments we observed that this detector performs better than those we had trained using classical solvers running on the computers we have in our data centers today. Besides progress in engineering synthetic intelligence we hope that improved mastery of quantum computing will also increase our appreciation for the structure of reality as described by the laws of quantum physics.”
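The post does not spell out the formulation, but the general shape of training a detector with adiabatic quantum optimization is a QUBO: choose a binary weight vector over weak classifiers that minimizes training error plus a sparsity penalty. Here is a minimal sketch of that idea, with all names and numbers my own assumptions; a quantum annealer would sample low-energy configurations, while this toy version brute-forces them classically.

```python
# Minimal sketch (not Google's code) of a QUBO-style training objective:
# choose binary weights w over weak classifiers h_i to minimize squared
# training error plus a sparsity penalty. An annealer samples low-energy w;
# here we brute-force the 2**N candidates classically.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, M = 8, 40                               # weak classifiers, training examples
H = rng.choice([-1.0, 1.0], size=(M, N))   # weak-classifier outputs h_i(x_j)
y = rng.choice([-1.0, 1.0], size=M)        # labels ("car" / "not car")
lam = 0.5                                  # sparsity penalty weight

def energy(w: np.ndarray) -> float:
    """QUBO objective: |y - H w|^2 + lam * |w|_0."""
    return float(np.sum((y - H @ w) ** 2) + lam * w.sum())

best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=N)),
           key=energy)
print("selected weak classifiers:", np.flatnonzero(best), "energy:", energy(best))
```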
I experimented with a mid-length exposure here, as it was a blend of day and night shots.
OK, ok, enough about rockets.
I need to head out and try to think like a techonomist.
When we consider the combinatorial explosion of possibly interacting ideas as the fountainhead of innovation, it not only creates the economy and explains accelerating change, it also subsumes biological evolution (raising the primary vector of progress to a higher level of abstraction) and nurtures a rational optimism for the future.
(blending Adam Smith, Matt Ridley, Richard Dawkins, Ray Kurzweil & Brian Arthur)
And stitched together by some fine dinner conversation with Matt Ridley (just before his TED Talk):
“Self-sufficient is another way of saying impoverished.”
“Innovation = ideas having sex.”
“There is literally nobody on the planet who knows how to make a computer mouse.”
This is a study tool that I came up with while thinking about zonohedra a while back. Just as planar rhombus tilings can be generated by grids of lines (see this article for more info), convex zonohedra seem to have a relationship with great circles placed on a sphere. I've no idea what makes it work, but just by playing around with a racquetball and some rubber bands, I made some pretty cool shapes and generated a (hopefully complete) list of all combinatorially distinct 6-sided and 12-sided convex zonohedra, so I'm pretty sure it works.
The reason I'm bringing it up now, instead of waiting until I finish folding the model and do a writeup, is that this maquette also happens to be useful in an explanation I posted in response to a question by Byriah Loper.
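For readers who prefer algebra to rubber bands, here is a sketch of the standard generator-vector view of zonohedra (my illustration, not a model of the great-circle construction itself): a zonohedron is the Minkowski sum of line segments, and its vertices lie among the subset sums of the generator vectors.

```python
# Sketch of the algebraic view of a zonohedron: the Minkowski sum of n
# segments ("zones") given by generator vectors. The 2**n subset sums are
# candidate vertices; n = 3 generators in general position give a 6-sided
# zonohedron (a rhombic parallelepiped), matching the 6-sided case above.
from itertools import product
import numpy as np

# Three generator vectors; any 3 in general position work.
generators = [np.array([1.0, 0.0, 0.1]),
              np.array([0.0, 1.0, 0.2]),
              np.array([0.1, 0.2, 1.0])]

# All 2**3 = 8 subset sums; the zonohedron is their convex hull.
vertices = [sum(b * g for b, g in zip(bits, generators))
            for bits in product([0, 1], repeat=len(generators))]
for v in vertices:
    print(np.round(v, 3))
```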
The image shows a 3D rendering (Imaris software) of a live confluent culture of NIH-3T3 cells obtained using confocal microscopy. The cells were co-transduced with five Lentiviral Gene Ontology (LeGO) vectors expressing the Cerulean (blue), EGFP (green), Venus (yellow), tdTomato (magenta), or mCherry (red) fluorescent proteins to provide combinatorial colors for progeny tracking. Groups of nearby cells of the same color descended from the same stem cells.
Credit: Daniela Malide, Jean-Yves Metais, Cynthia Dunbar, National Institutes of Health
__________________________________________________
Outlining a Theory of General Creativity . .
. . on a 'Pataphysical projectory
Entropy ≥ Memory ● Creativity ²
__________________________________________________
Study of the day:
Il y aurait deux infra-conscients, le plus profond serait structuré comme un ensemble quelconque, pure multiplicité ou possibilité en général, mélange aléatoire de signes ; le moins profond serait recouvert des schémas combinatoires de cette multiplicité.
(Michel SERRES)
There would be two infra-consciousnesses: the deeper one would be structured like an arbitrary set, pure multiplicity or possibility in general, a random mixture of signs; the shallower one would be covered by the combinatorial schemas of this multiplicity.
__________________________________________________
rectO-persO | E ≥ m.C² | co~errAnce | TiLt
Fangruida -- Modern Science and Technology Engineering and Comprehensive High-end Technology R&D, Design and Manufacturing (Introduction to Modern Science and Engineering Technology Research), 2013 v2.3, 2021 v2.5, online global version, mobile version (compiled by Bick, November 2021, Colombia).
Topics: comprehensive deep development of the Moon; ocean cities and marine architecture; desert cities; mountain cities; life genetic engineering; green plant nutrition engineering; smart engineering; nuclear engineering and the peaceful use of nuclear energy; advanced manufacturing: the new world intelligence revolution, new industrial revolution, new planetary revolution, new lunar revolution, new cosmic revolution.
Architecture and bridge design, large-scale circuit design (chip development, etc.), mechanical and electrical product design and manufacturing, pharmaceutical product development, genetic engineering, aerospace technology, atomic energy development and utilization, agricultural engineering, computer-aided design and manufacturing, new-materials R&D, military engineering, industrial robots, aircraft and ships, missiles, spacecraft, spaceships, rockets, submarines, hypersonic missiles and the like are all critically important, and foresight in these highly integrated fields is the key. These sciences and technologies are the powerful driving force of historical development, and the key to whether a country can reach the peak of the world. With the rapid development of modern science, software of every kind emerges in an endless stream: mathematical, civil, mechanical, electrical and electronic, chemical, aircraft, ship, missile, spacecraft, rocket, materials, bionic simulation, and medical software. Their appearance and wide application matter greatly to industrial modernization and intelligence: they advance artificial intelligence and accelerate the development of human society. The same holds for marine engineering, overall lunar development, intelligent highly integrated engineering, high-speed heavy-lift rocket transportation, submarine tunnels, reservoir dams, agricultural engineering, and biomedical engineering. Overall lunar development planning, Mars engineering, desert engineering (desert cities), alpine cities, marine engineering (ocean cities), life genetic engineering, green plant nutrition engineering, VLSI design and manufacture, and large-scale civil and hydraulic engineering (roads, bridges, tunnels, super-tall buildings) all belong here. The modern scientific revolution is guided by the revolution in physics, takes the emergence of modern cosmology, molecular biology, systems science, and soft science as its important content, and is characterized by the interpenetration of natural science, social science, and the science of thought to form interdisciplinary fields.
In the past 30 years, emerging technologies such as computers, energy, new materials, space, and biology have appeared in succession, producing the third scientific and technological revolution, which far exceeds the previous two in scale, depth, and impact. Its basic features: (1) it greatly promoted the development of social productive forces, changing the means of improving labor productivity; (2) it promoted changes in the socio-economic structure and in social life: the share of the tertiary sector rose, and daily life (food, clothing, housing, transportation) changed; (3) it promoted the adjustment of the international economic structure, connecting localities more closely; (4) a planetary revolution and a lunar revolution: lunar engineering, lunar industrial intelligent cities, and Moon-Earth round-trip communication systems. We should develop the Moon quickly; it is a real chance to overtake on the corner. A physical presence on the Moon will be of great strategic importance for thousands of years, since many resources are first-come, first-served: orbits, the best lunar sites, electromagnetic bands, and so on. Lunar cities should make full use of local resources and environment, minimizing the supplies and equipment that must be launched from Earth. (5) Ocean cities and ocean buildings; desert cities; mountain cities. (6) Life genetic engineering and drug R&D. (7) Green plant nutrition engineering. (8) Smart engineering. (9) Nuclear engineering. (10) Advanced manufacturing engineering.
Modern science and technology develops rapidly, with inventions and innovations too numerous to count, but the most important technical fields are those just listed: lunar engineering, lunar industrial intelligent cities and Moon-Earth communication systems (lunar radius: 1,737 km), ocean cities and buildings, desert and mountain cities, life genetic engineering and drug R&D, green plant nutrition engineering, smart engineering, nuclear engineering, advanced manufacturing, and others. It is in these fields that development competition among countries plays out; military and aerospace technology are of course among them. Scientific discoveries can last for thousands of years, while technological inventions stay fresh for only a few decades and become obsolete within a few hundred years: electronic products, smart cars, and smartphones all have short life cycles, though a technology's limit may also stretch to centuries. Even scientific discoveries are not permanent. Tens of thousands of years from now, people will take a new leap in understanding the universe and natural law; to the humans of that era, living on the Moon and Mars with inventions beyond belief, we people of Earth will seem like uncivilized ancients. Their intelligence may be dozens or hundreds of times ours, and their scientific discoveries unimaginable. Mathematics, physics and chemistry, the natural, agricultural, medical, industrial, legal and commercial disciplines, literature, history, philosophy, classics, education: everything will be renovated and mutated.
Mathematics is the science of quantitative relationships and spatial forms in the real world. It arose and developed in humanity's long practical activity, originating in counting and measurement; as productive forces developed, ever more quantitative study of natural phenomena was required, while mathematics' own development gave it a high degree of abstraction, rigorous logic, and wide applicability. It divides roughly into two categories: basic (pure) mathematics, including mathematical logic, number theory, algebra, geometry, topology, function theory, functional analysis, and differential equations; and applied mathematics, including probability theory, mathematical statistics, computational mathematics, operations research, and combinatorics. The basic technical sciences mainly include civil, electromechanical, chemical, information, aerospace, ocean, mining, medical, materials, computational, agricultural, energy, lunar, Mars, and life engineering.
A major in computational mathematics and its application software trains students to master the basic theories, knowledge, and methods of mathematical science, to apply mathematics and computers to practical problems, and to work in research, teaching, production, or management in scientific, educational, and economic departments. A computer software major cultivates well-rounded graduates who master professional theory, the fundamentals of programming, and the internationally current software development environments, tools, and norms, with strong development practice and good software engineering literacy.
Modern mathematics is an edifice built from a series of abstract structures. It rests on humanity's innate belief in the inevitability and accuracy of mathematical reasoning, and is the concentrated expression of confidence in the capacity, origin, and power of human reason. Deductive reasoning from self-evident axioms is held to be absolutely reliable: if an axiom is true, the conclusions deduced from it must also be true; by applying these seemingly clear, correct, and perfect logical steps, mathematicians reach conclusions that appear unquestionable and irrefutable.
Naturally, mathematics keeps developing and mutating, and an eternal mathematics is unrealistic, mainly because the logical-cognitive structure of the human brain changes; mathematics will continue to mutate or diverge accordingly, into mathematical logic, natural logic, image logic, and hybrid compound logics. The understanding of the essential characteristics of mathematics above proceeds from its sources, its mode of existence, and its level of abstraction, and sees those characteristics mainly in the results of mathematical research.
Common general-purpose mathematical packages include Matlab, Mathematica, and Maple; Matlab excels at numerical calculation, while Mathematica and Maple excel at symbolic computation and formula derivation (a small illustration of this split follows at the end of this passage). Special-purpose packages include: plotting (MathCAD, Tecplot, IDL, Surfer, Origin, SmartDraw, DSP2000); numerical computing (Matcom, DataFit, S-Spline, Lindo, Lingo, O-Matrix, Scilab, Octave); numerical libraries (LINPACK, LAPACK, BLAS, GERMS, IMSL, CXML); finite element analysis (ANSYS, MARC, PARSTRAN, FLUENT, FEMLAB, FlexPDE, Algor, COSMOS, ABAQUS, ADINA); and statistics (GAUSS, SPSS, SAS, S-Plus).
Obviously the result (theory as a deductive system) does not reflect the whole of mathematics; another very important aspect is the process of mathematical research. In general, mathematics is a dynamic process, an "experimental process of thinking," the abstract generalization of mathematical truth, of which the logical deductive system is a natural product. The research process shows the richness of mathematical objects and humanity's invention of mathematics. "Mathematics is a language"; mathematical activity is social, and within the history of human civilization it is the crystallization of the high wisdom by which humans understand and transform nature and improve themselves and society. Mathematics decisively shapes human ways of thinking; to say that among mathematics, physics, and chemistry, mathematics comes first is no exaggeration.
On this understanding of the essence of mathematics, its specific characteristics have been discussed from several angles. The general view is that mathematics is abstract, precise, and widely applicable, abstraction being the most essential trait. Seen from the research process and from mathematics' relations with other disciplines, mathematics is also imagistic, plausible, and quasi-empirical, with a "falsifiability" character. (As for the tools: Matlab suits the engineering world, with its toolboxes, fast coding, and many third-party integrations, the most obvious being COMSOL; Mathematica's syntax is excellent, supporting almost every programming paradigm.)
The understanding of mathematics' characteristics is itself of its time. Regarding rigor, each period of mathematical history had different standards: from Euclidean geometry to Lobachevskian geometry to Hilbert's axiom system, the criteria varied widely, especially after Gödel proposed and proved the incompleteness theorems, when even axiomatization, the rigorous scientific method once so highly regarded, was found to be flawed. The rigor of mathematics, as displayed in its history, is therefore relative.
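Here is the promised illustration of the numeric-versus-symbolic split, using Python stand-ins (my choice; the passage names Matlab, Mathematica, and Maple): sympy plays the Mathematica/Maple role of exact formula derivation, numpy the Matlab role of fast floating-point evaluation.

```python
# Sketch of the numeric-vs-symbolic split: sympy derives an exact formula
# (the CAS role), then numpy evaluates it over a grid (the numeric role).
import numpy as np
import sympy as sp

x = sp.symbols('x')
expr = sp.sin(x) * sp.exp(-x)

derivative = sp.diff(expr, x)             # symbolic differentiation
print("d/dx:", sp.simplify(derivative))

f = sp.lambdify(x, derivative, 'numpy')   # hand the formula to the numeric side
grid = np.linspace(0.0, np.pi, 5)
print("values:", f(grid))                 # vectorized floating-point evaluation
```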
As for the plausibility of mathematics: mathematics is the tool and means of physical research, and some methods of physics carry strong mathematical ideas, so learning physics can also improve mathematical cognition. Mathematical logic, also called symbolic or theoretical logic, is the study of formal logic by mathematical methods; it is both a branch of mathematics and a branch of logic, and its object is the formal systems obtained by symbolizing the two intuitive concepts of proof and computation. Mathematical logic is an integral part of the foundations of mathematics; despite the word "logic" in its name, it does not belong to pure logic alone. It is a product of the development of modern Western logic, broadly speaking predicate logic, which introduces mathematical methods into logic. Like formal logic it attends mainly to form rather than content, but its method differs: "all S are P" becomes "for every x, if x is S, then x is P." Both formal and mathematical logic focus on conceptual extension and accept the law of identity, but they analyze sentences differently. Physics is a discipline close to exploring the origins of the world, so it connects with many others: as large as the motion of celestial bodies and the seasons of the Earth, as small as electron transfer in chemical reactions; physical calculation requires mathematics, and the calculus was created by Newton to study gravitation. Mathematics matters to the development of physics, and physics to the development of mathematics.
Expository language must be accurate, rigorous, scientific, plain, thorough, and naturally logical, accuracy being the prerequisite. Expressions of time, space, quantity, range, degree, feature, property, and procedure must be exact, since description is intensely practical and a slight error misses by a mile. Under the premise of accuracy, some expository language is noted for plainness and some for liveliness; it varies with the object described and the author's style. In fact the connections and differences between scientists and engineers go far beyond this; the boundary can be broken entirely, for some outstanding scientists are also outstanding engineers and some outstanding engineers do the work of scientists.
Engineering mathematics spans: properties of complex numbers, functions of a complex variable, analytic functions and their integrals, power series over the complex field, Taylor and Laurent series of analytic functions, singularities, residues and their calculation; the string vibration equation, the heat conduction equation and the potential equation, classification of second-order linear equations, the traveling-wave solution of the string equation, two- and three-dimensional wave equations, separation of variables, Bessel functions, Legendre polynomials and their properties, expansion of functions in eigenfunctions; the Fourier transform, the Laplace transform, generalized functions and their Fourier transforms, the Green's function method, variational problems, Sobolev spaces and weak solutions, the finite element method for boundary value problems, global stiffness and load matrices, and programming finite element solutions in Mathematica. Mathematical physics equations and special functions are likewise an important branch of engineering mathematics, alongside vector algebra and analysis, tensor analysis, matrix algebra and analysis, analytic and differential geometry, functional analysis and variational methods, ordinary and partial differential equations, optimization methods, graph and network models, stochastic mathematics (probability, statistics, stochastic processes), computational intelligence models (ANN, GA, SVM, etc.), pattern recognition, machine learning, and data mining.
[...] the main representative of [...], author of the "Nine Chapter Collection". 6. Plato, the ancient Greek philosopher and student of Socrates, author of the dialogues "Apology of Socrates", "The Republic", "Parmenides", "The Sophist", and others. 7. Aristotle, student of Plato, master of Greek philosophy, an encyclopedic philosopher and founder of many disciplines; his representative works include the "Organon", "Physics", "Metaphysics", "Nicomachean Ethics", and "Politics". 8. Epicurus, the ancient Greek philosopher, one of the founders of hedonist ethics. 9. Pyrrho, the ancient Greek philosopher and founder of skepticism. 10. Plotinus, the late Greek philosopher, an Egyptian, the main representative of Neoplatonism, author of the "Enneads" ("Nine Chapters"). 11. Marx, Hegel, Kant, etc.
From the 16th to the 19th century many scientists were born in England: 1. Newton created a complete system of mechanics. 2. Faraday discovered electromagnetic induction. 3. Dalton founded the modern atomic theory. 4. Darwin created the theory of biological evolution ("On the Origin of Species").
Share of the world's top scientists by country: United States, 1,465 (47.5%); United Kingdom, 346 (11.2%); Germany, 177 (5.7%); China, 175 (5.7%); Australia, 113 (3.7%); Canada, 97 (3.1%); Netherlands, 94 (3%); France, 89 (2.9%); Japan, 74 (2.4%); Switzerland, 71 (2.3%) (quoted from web resources). Newton, Franklin, Darwin, Maxwell, Hertz, Bohr, Fermi, Marie Curie, Einstein, Heisenberg, Lorenz, Ampère, Pasteur, Watson and Crick, Feynman, Oppenheimer, Davy, Faraday, Röntgen, Hahn, Thomson, Rayleigh, Haber.
Moon data: surface area 37.93 million square kilometers, smaller than Asia and larger than Africa; equatorial circumference 10,921 km; escape velocity 2.38 km/s; orbital velocity at 10 km altitude 1.674 km/s; surface gravitational acceleration 1.62 m/s², about one sixth of Earth's; equatorial surface temperature from a minimum of -173 °C to a maximum of 117 °C, average -53 °C; surface pressure one trillionth of an atmosphere by day and one ten-millionth at night, almost an absolute vacuum. Main surface rock components: the near-side lunar crust is about 50 km thick, the far side about 65 km; the lunar mantle is about 1,200 km of solid rock with much iron; the partially molten outer core is 260 km thick, and the solid iron inner core has a radius of 240 km. With no air at the surface, heavy cosmic radiation, and damage from small meteors, cities should be built below the surface or under very thick domes, though other measures and methods can also be taken. Building materials should of course use all local resources, the Moon's soil and rocks; if large amounts of metal are needed, lunar soil should be smelted directly.
Astronomy studies the structure and development of celestial bodies in space and the universe, including their structure, properties, and laws of motion; it is an ancient science that has mattered since the dawn of civilization. Cosmology studies the universe as a whole, humanity's place in it, and its large-scale structure and evolution; the origins of observable structures, from giant galaxy clusters down to the solar system, fall within the field of celestial evolution. Fundamental questions include when and how the universe began, how galaxies formed and acquired the shapes and sizes we observe, how stars were born, and how planets and life evolved. The planetary hierarchy includes the planets of planetary systems, the satellites revolving around them, and a great number of small bodies such as asteroids, comets, meteoroids, and interplanetary matter; above it sits the stellar level.
Lunar soil and rocks: the lunar surface is covered by a soil layer of rock debris, powder, breccia, impact-melt glass, and volcanic glass formed by long-term meteorite and micrometeorite impacts and the accumulation of their ejecta. Internal structure: records of natural moonquakes and large meteorite impacts show the Moon has a crust-like layered interior; the near-side crust is about 50 km thick and the far side about 72 km, the lunar lithosphere extends to a depth of at least 1,000 km, conductivity profiles give a metal-core radius of about 360 km while magnetic measurements give 400-500 km, and the maximum internal temperature does not exceed 1,300 °C, below the melting point of the rock.
If humans are to live normally on the lunar surface, they cannot do without fresh water and oxygen, and the Moon has neither water nor air. What to do? Scientists found that lunar sand contains a great deal of oxygen and proposed producing fresh water and oxygen from it: first use forklifts to excavate lunar surface soil automatically and select the oxygen-bearing iron minerals, then reduce those minerals with hydrogen to obtain fresh water; with water in hand, electrolyze it to obtain oxygen and hydrogen (see the reaction sketch after this lunar section). The oxygen is liquefied and stored for base residents; the hydrogen initially used as the reducing agent can be shipped from Earth, and once production begins the hydrogen from electrolysis is recycled. Second, food supplies must be guaranteed for a self-sufficient lunar settlement. Where does food come from? In recent years scientists have run many biological experiments on space stations and cultivated more than 100 "space plants," including wheat, corn, oats, soybeans, tomatoes, radishes, cabbage, and beets; under zero gravity, plant seeds in lunar soil germinate at higher rates, grow faster, and flower or head earlier. So long as a lunar agriculture and aquaculture base is established, the food supply of a lunar base is fully guaranteed. Third, energy supply is not a problem: the Moon has no wind or rain, sunshine all day, and no atmospheric absorption, with solar radiation about 1.5 times as intense as on Earth, so solar energy can serve lighting, heating, and power generation, and nuclear plants can be built if necessary. Without air resistance, the shape of transport equipment and cargo is limited only by the transport system's own design dimensions, not by aerodynamics. Rail transport has low resistance, and rails can be built from the aluminum, iron, titanium, and other elements abundant on the lunar surface; earth-and-stone building materials can be melted, cut into blocks, and bonded with lunar soil or rock.
In the 30 to 50 years after 2030, local mining, smelting, energy, and other industries on the Moon can gradually reach scale. Limited by Earth's launch capacity, lunar industrialization must rely mainly on the accumulation of local materials; the main industrial categories include energy (power generation and heat collection), mining, smelting, and processing. The initial solar equipment must be transported as bulk shipments, and solar panels yield more power on the Moon than on Earth. The United States' Artemis program plans to send astronauts to the Moon in 2024, and Russia will send astronauts to orbit the Moon in 2025; many countries have probe or sample-return projects before 2025, including China's Chang'e 6 and 7. But there are no ongoing plans for lunar mining and smelting, and even plans for a lunar station remain unclear, partly because groundwork must be laid by earlier projects, launch and recovery technology improved, and more surface experience gained. Within 50 to 100 years the development of the Moon will become an aerospace hotspot. At this stage only the Moon can realistically receive human development and migration, with Mars waiting for later; the Moon is the first body humans can conquer and settle, and if we cannot conquer even it, how could we conquer Mars?
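Here is the reaction sketch referenced above. The text does not name the oxygen-bearing iron mineral; ilmenite (FeTiO3) is the usual candidate in lunar in-situ resource proposals, so that is the assumption here.

```latex
% Sketch of the hydrogen-reduction / electrolysis loop described above.
% Assumption: the unnamed oxygen-bearing iron mineral is ilmenite (FeTiO3).
\begin{align}
  \mathrm{FeTiO_3} + \mathrm{H_2} &\longrightarrow
      \mathrm{Fe} + \mathrm{TiO_2} + \mathrm{H_2O} \\
  2\,\mathrm{H_2O} &\xrightarrow{\ \text{electrolysis}\ }
      2\,\mathrm{H_2} + \mathrm{O_2}
\end{align}
% The O2 is liquefied and stored; the H2 returns to the first reaction,
% closing the loop so only make-up hydrogen need be shipped from Earth.
```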
Desert engineering has made outstanding achievements in ecological protection systems for desert areas, restoration and reconstruction of degraded ecosystems, water conservancy construction, transportation construction, oil and gas exploration and development, land development and oasis construction, and water-saving irrigation, proving convincingly that human beings have the power not only to overcome the harm of desertification but to make the desert benefit mankind.
The ocean is the cradle of life and a treasure trove of resources, covering 71% of the Earth's surface and holding 97% of its water. As world population grows, developing marine resources has become a strategic measure for human survival and development. In modern marine development, offshore oil and gas, marine transportation, fishing, and sea-salt production are mature industries of huge scale and output value, undergoing technological transformation and further expansion; marine aquaculture, seawater desalination, extraction of bromine and magnesium from seawater, tidal power, offshore factories, and undersea tunnels are developing rapidly; deep-sea mining, wave power, ocean thermal power, uranium extraction from seawater, and offshore cities are being studied and tested for developing seabed mineral resources. Marine engineering is very important; the ocean is the most important source of life for all mankind, and its protection and development includes many measures, of which the ocean city is an important one.
To develop a marine city is to build a city on or under the sea, not, as usually imagined, merely near it. Many coastal cities are prosperous, and expanding them continuously is a natural choice; but with global warming, melting polar glaciers, and sea levels rising year by year, coastal cities face submersion and the assault of powerful typhoons and waves, so simple expansion by reclamation is no long-term solution. Ocean cities can further expand humanity's living space and develop the vast resources of the sea. In 2002 an American company envisioned a sea city for tens of thousands of people, the super cruise ship "Freedom": 1,372 m long, 229 m wide, and 107 m high, with a 25-story building above the main deck, designed to cruise the world's oceans, an embodiment of people's longing for the sea.
Structural problems are the first that ocean cities must solve. Whether a building floats or sits underwater, resisting storms, waves, and enormous water pressure while ensuring the stability, safety, and reliability of the structure comes first. For durability, materials resistant to seawater corrosion must be chosen, such as magnesium alloys or synthetic-resin concrete; for comfortable living, the overall structure should not shake too much. Rigid, flexible, durable, corrosion-resistant, high-strength composite materials can build large semi-submersible platforms that resist wind, waves, and earthquakes; such a platform uses rigid-flexible structure and connections, moored by ground or drop anchors in depths of 1,000-5,000 m, or bottom-sitting or semi-submersible at 50-100 m in shallow seas, protected against earthquakes, tsunamis, storms, and lightning. To prevent flutter and shaking, rigid and flexible, hard and soft measures are combined; marine building materials are lightweight, durable, fire- and corrosion-resistant, with promenades and plank roads readily available.
The continental shelf is the most developed area of seabed sedimentation, its sediment types and characteristics controlled by environmental factors. Because shelf waters are a shallow-sea environment, the factors affecting shelf deposition include: sea-level change; sediment supply; hydrodynamic conditions; climate and its fluctuations; detrital particle size; chemical factors; continental-shelf topography; openness of the sea area; geological characteristics of surrounding land; tectonic background; Earth evolution; and so on. Ocean engineering has great potential.
Living-environment problems: how to provide air, food, light, and the other conditions of human survival must also be carefully considered. Suitable air is easy for floating cities but needs precise control for underwater ones; to maintain an atmosphere like Earth's, a sufficiently robust air-control system must hold pressure, temperature, and humidity, considered from the very beginning of design. Electricity is the main form of energy: besides supporting residents' activities, it is a necessary condition for keeping the environment livable, through air circulation and seawater desalination. To protect the marine ecosystem, renewable and clean energy should be used as much as possible. The turbulent waves, rising and falling tides, huge ocean currents, and the temperature difference between upper and lower layers of the sea hold enormous energy, so photovoltaic, wind, wave, tidal, ocean-current, and ocean-thermal generation will be the main supply, supplemented by nuclear power if short. Floating cities, with their large areas, favor photovoltaic, wind, and wave power; underwater cities, where power sources are scarcer, may consider thermal-gradient and nuclear power, plus marine resources such as oil, natural gas, and combustible ice.
Garbage disposal: recycling waste, avoiding pollution, and building a feasible circular economy will be the key controlling factor for an ocean city. In the "Future City in the Maritime Environment" concept, urban garbage is sorted and processed for partial reuse. Fully recyclable wastes such as domestic garbage, sewage, and carbon dioxide are handled by treating sewage for plant irrigation and decomposing domestic garbage into fertilizer for plant factories, which, besides providing grain and meat to residents, pass their residual processed waste to shallow-sea aquaculture as bait. Wastes convertible to energy, such as waste paper, plastics, and building materials, are partly converted in special treatment plants into fertilizers for agriculture, animal husbandry, and fisheries, and partly used to produce renewable materials and fuels. A small amount of highly harmful, polluting garbage may still need transfer to land for centralized, harmless treatment or destruction; with better technology and strict environmental measures, this fraction will shrink.
Because of the complexity of the environment and the need for independent, effective operation, marine buildings must meet the requirements of long-term human settlement; beyond the obvious problems above, many others must be re-examined and carefully designed. Modern engineering here is a huge challenge: building a marine city is a multi-disciplinary, multi-professional, multi-field, all-round integrated system with many difficulties to overcome. Only by organically integrating the various technologies under an environmental-protection concept and treating the proposals, new designs, and new concepts as one huge system can a more economical and feasible technical route be found, continuously advancing the planning and realization of the ocean city, and likewise of desert and mountain cities.
Reservoir dams: set drainage and seepage guides on the downstream face to discharge seepage water promptly. Apart from homogeneous earth dams, the dam body is sealed mainly by a core wall of clay, asphalt, concrete, or other material. Grouting is the most widely used method of cutting off seepage under the dam foundation, the method chosen according to the foundation geology: where the foundation is rock with cracks, consolidation grouting; where rock or gravel foundations have leakage channels, curtain grouting; where the foundation is silty soil, silty clay, silt, sand, gravel, or other loose permeable strata or fill, high-pressure jet grouting and similar measures. For large reservoirs the design of the dam is very strict.
Structural design: the most basic requirement is the safety and stability of the structure, so relevant structural data must be extensively collected and organized during design to assure engineering quality and improve the safety and stability of the dam. Poor dam and water-conveyance-tunnel foundations cause uneven settlement where compactness and bearing capacity fail specification; uneven settlement leads to cracks in the dam body and crest pavement and to rupture of conveyance culverts. Structural safety is paramount. For metal structural equipment, preliminary field investigation must be done well: accurate hydrogeological data reveal the site's natural conditions in detail, so equipment suited to actual local conditions can be chosen, its quality controlled, and rust and corrosion avoided in service, improving construction quality and the dam's role.
Redundant design and special accident safety design, for spaceships, ships, dams, ultra-high buildings, large-section tunnels, extra-large bridges, atomic reactors, and hazardous chemicals; safety protection, prediction, forecasting, and early-warning systems for earthquake, tsunami, and volcanic eruption; extreme-climate forecasting; prediction and early warning of major infectious disease and plague; the program design of important software and the R&D of important intelligent hardware: all are particularly important, demand the full attention of scientists and engineers, and call for redoubled effort.
World dam failures. Malpasset, France, 1959: the failure of the Malpasset dam is a notorious accident in the history of arch dam construction, more serious than the four American dam accidents of the 1920s and 1930s recorded in the annals of modern dam failures. The dam stood on the Reyran River in the Var department of southern France, built only for water supply and irrigation. It was designed by André Coyne, the famous French civil engineer of the day, who led the construction of 70 dams in 14 countries in his lifetime.
The dam was about 66 meters high and 223 meters wide at the crest; construction in reinforced concrete started in 1952 and finished in 1954, but France's turbulent politics delayed operation until the end of 1958. In December 1959 the area had continuous torrential rain, and at noon on December 2 the reservoir reached its highest level. Reservoir engineer André Ferro immediately asked to open the gates, but the leaders were slow to approve; when the gates finally opened at 6 pm, discharge was too slow, the level dropping only a few centimeters in 3 hours. At 9 pm that night the Malpasset dam suddenly collapsed: with a roar, a wave about 40 meters high carrying reinforced-concrete fragments burst through the breach at 70 kilometers per hour, and the resulting air shock wave turned the town of Fréjus, about 10 kilometers downstream, into ruins within half an hour; nearly all nearby buildings, roads, railways, and power and water lines were swept more than ten kilometers into the sea. By official statistics, 423 people were killed directly, among them more than 100 children, with many missing and many more injured.
Vajont, Italy, 1963: the Vajont Dam sits in the scenic Alps less than 100 kilometers from Venice. After the Second World War Italy entered rapid development, and the industrial north's demand for electricity kept rising; the Vajont gorge offered unique geographic conditions, and even before the war government and engineers had proposed a dam combining power generation and storage. The final design, a concrete double-curvature arch dam, then the world's tallest, had such excellent stress behavior that the dam remained standing even after the disaster, but the mountains flanking it could not bear the weight of dam and reservoir.
(Aside on the image source: the International Commission on Large Dams (ICOLD), founded in 1928, is the most authoritative international non-governmental academic organization in dam engineering. It uses a national-membership mechanism, currently 104 national members covering the countries holding more than 95% of the world's reservoir dams. Its purpose is to advance the planning, design, construction, operation, and maintenance of dams and water conservancy and hydropower projects through exchange of information, including research on technical, economic, financial, environmental, and social questions; its activities include technical exchanges among national committees, congresses, annual and executive meetings, sub-regional meetings, cooperative research and experiments, and publication of proceedings, technical bulletins, and dam statistics. Image: GETTY IMAGES.)
The dam was built in 1957 by the company holding the private power monopoly in northern Italy. The builders later changed the original design, raising the dam from 230 to 262 meters and tripling the storage capacity; as storage grew, the geology around the dam became unstable. At the end of 1962 the Italian national electricity company bought the reservoir and, to take delivery sooner, accelerated filling. From September 28, 1963, heavy rain fell on the Vajont area; the creeping landslide accelerated, and people nearby began to hear strange noises from the valley. The authorities decided to lower the reservoir level, too late: on the night of October 9, 1963, a landslide of 260 million cubic meters slid into the reservoir within 45 seconds, filling half of it. The displaced water formed a 250-meter wave and an air blast like an atomic explosion, flooding nearby towns and villages and killing nearly 2,000 people; of the more than 60 technical and management staff in the building on the reservoir bank, only one survived. The intact dam, though it lost its storage and generating function, still stands and has become a local tourist attraction. In 2008, launching the International Year of Planet Earth, UNESCO listed the Vajont tragedy among the human engineering tragedies "due to the errors of engineers and geologists."
Banqiao, China, 1975: in August 1975 a super typhoon brought Henan a torrential rain that broke the then records for China and the world, causing a great flood on the upper Huai River; a small reservoir began to fail on August 6, and in the early morning of August 8 two large reservoirs and nearly 60 small and medium ones collapsed in succession within a few hours. More than 600 million cubic meters of floodwater poured down like a landslide, a flood crest tens of meters high and about 12 kilometers wide that inundated 29 counties and cities in Henan and Anhui within hours. According to "The Great Flood in Chinese History" (1999), 11 million people in Henan Province were affected, with heavy casualties; 17 million mu of farmland were flooded, 5.96 million houses collapsed, 302,300 draft animals and 720,000 pigs were washed away, and 102 kilometers of the Beijing-Guangzhou railway were destroyed, interrupting it for 18 days and affecting transportation for 48 days. It is known as the "75.8" flood.
Civil engineering: roads and bridges, tunnels, dams, super-tall buildings, etc. Design proceeds in stages. (1) Engineering feasibility study: preliminary planning, key technology research, and feasibility from the economic, technological, and social-development standpoints. (2) Preliminary design: selecting the recommended and optimal scheme among alternatives; it solves the overall planning problems, including bridge site selection, bridge type, span arrangement, longitudinal and cross-sectional layout, main structural dimensions, project budget estimate, and main material quantities; the preliminary design estimate is the basis for controlling the investment of the construction project and compiling the construction budget. (3) Construction drawings: technical documents that further detail and fix the construction principles, technical schemes, technical decisions, and total investment approved in the preliminary design, with detailed analysis and calculation of each bridge component, drawing of construction plans, and preparation of construction methods, material schedules, and budget.
Bridge profile design covers the total span, span division, deck elevation, alignment, longitudinal grade, and foundation embedment depth and method. (1) Total span: weighed comprehensively from hydrological data, riverbed scour, foundation form, channel arrangement, and cost. (2) Span division: the guiding principle is that the combined cost of superstructure and substructure be most economical, considering span, number of openings, structural system, and readiness requirements; a three-span continuous beam at 1:0.8, a five-span at 1:0.9:0.65. (3) Deck elevation: first satisfy navigation requirements (navigation clearance), set by the navigation authority and the design flood level; the bearing underside sits 25 cm above the design flood and the arch-crown underside 1 meter above, with flyovers analyzed case by case; the longitudinal grade of the bridge is at most 4% (3% in cities) and of approach bridges at most 5%, with vertical curves at grade changes (a small sketch checking these grade limits follows below).
Bridge cross-section design depends on deck width, structure type, and sectional arrangement: traffic lanes (7 or 9 m) plus sidewalks (1 + n x 0.5 m) plus bicycle lanes (n x 1 m); sidewalks and safety strips should stand at least 20-25 cm above the road surface, generally more than 30 cm; a deck cross slope of 1.5%-3% aids drainage; railings, guardrails, lamp-post locations, and bridge pipelines must be placed. Bridge layout: the alignment should cross the river or the route below as nearly perpendicularly as possible, avoiding skew; where skew is unavoidable the skew angle is usually at most 15 degrees, and at most 5 degrees on navigable rivers, larger angles requiring special structural analysis; the alignment should join the bridgehead approach roads smoothly and conform to the specifications.
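Here is the promised sketch, a small checker of my own devising that simply encodes the numeric limits quoted above (4% longitudinal grade, 3% urban, 5% approach; 1.5%-3% cross slope; 25 cm bearing clearance over the design flood); the function name and interface are assumptions for illustration, not from any design code.

```python
# Minimal sketch encoding the numeric rules quoted above.

def check_profile(long_grade: float, cross_slope: float,
                  bearing_elev_m: float, design_flood_m: float,
                  urban: bool = False, approach: bool = False) -> list[str]:
    """Return a list of rule violations for a candidate bridge profile."""
    issues = []
    # Longitudinal grade: <= 4%, or 3% in cities, or 5% on approach spans.
    limit = 0.05 if approach else (0.03 if urban else 0.04)
    if long_grade > limit:
        issues.append(f"longitudinal grade {long_grade:.1%} exceeds {limit:.0%}")
    # Deck cross slope between 1.5% and 3% for drainage.
    if not 0.015 <= cross_slope <= 0.03:
        issues.append(f"cross slope {cross_slope:.1%} outside 1.5%-3%")
    # Bearing underside at least 25 cm above the design flood level.
    if bearing_elev_m - design_flood_m < 0.25:
        issues.append("bearing underside less than 25 cm above design flood")
    return issues

print(check_profile(0.035, 0.02, 12.40, 12.20, urban=True))
```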
• 1. On the premise of meeting the functional requirements, the best structural type should be selected - pure, clean and stable. Quality is unified in beauty, and beauty is subordinate to quality.
• 2. Beauty shows itself mainly in a harmonious, well-proportioned choice of structure, with a sense of order and rhythm. Too much repetition leads to monotony.
• 3. Pay attention to harmony with the surroundings. The choice of materials, the texture of surfaces and the use of colour play an important role. Checking a physical model helps the designer judge the real impression and examine shadow effects.
• 4. A beautiful bridge should influence people positively through its character. Beauty and ethics are inherently linked: a beautiful environment directly shapes people's sensibilities, and the beauty of nature and of the man-made environment are both necessary for people's physical and mental health.
• The choice of structural form for a bridge rests on a thorough comprehensive analysis and comparison of the bridge's technology, economics and construction conditions;
• First determine the span arrangement from the terrain, geology, navigation and other requirements, and draw up the candidate bridge schemes (usually 2~4);
• Next, work out the technical and economic indicators of each candidate scheme, including consumption of the main materials, total investment, construction period, operating conditions, maintenance costs, the technical demands of the construction method (whether there are difficult operations, etc.) and any special materials; and draw up the dimensions of the main structural components;
• Finally, compare the schemes technically and economically and select the optimum: weigh all the indicators and choose on the principles of applicability, economy and aesthetics, or recommend the leading scheme in the light of other objective conditions and special requirements.
• Bridge codes are not the same throughout the world; they must be applied according to the actual situation and cannot simply be copied.
The collapse of the Quebec Bridge in 1907. On August 29, 1907, a section of the partly built bridge at Quebec, Canada collapsed into the St. Lawrence River. The collapse threw dozens of bridge builders and mechanics into the water and killed about 75 people. The disaster happened just as the workers were about to finish for the day, when part of the bridge deck collapsed, setting off a chain reaction of failing members and cables. A later report blamed the bridge's engineers for the accident. When the bridge was being rebuilt in 1916, the central span collapsed again as it was being hoisted into place, killing 13 workers.
• 1-3-1 Bridge load classification. According to the probability of occurrence, loads are divided into:
Primary loads, secondary loads and special loads. The highway design specifications divide them into permanent loads, variable loads and accidental loads.
• Permanent loads - during the service life of the bridge their position, magnitude and direction do not change with time, or change so little that the change can be ignored. Main types: self-weight of the main girder structure, deck pavement and ancillary facilities; earth weight and earth pressure; internal and external prestressing; concrete shrinkage and creep effects; foundation displacement effects, etc.
• Basic variable loads (live loads) - the service loads of the bridge: vehicles, people, and loads caused indirectly by vehicles. Cars, trailers, tracked vehicles, crowds (350) and special vehicles; for curved bridges, centrifugal force (centrifugal coefficient C = V²/(127·R), with V in km/h and R in meters; a small numeric sketch of this coefficient follows at the end of this section) and impact (impact coefficient) must be considered.
• Other variable loads include vehicle braking force (it depends on the bearings and the direction of travel, and is applied 1.2 meters above the deck), bearing frictional resistance, temperature, wind load, water pressure, etc.
· Wind - small and medium bridges are checked against static wind pressure; large bridges are analysed dynamically.
· Vehicle braking force - used when calculating bearings and piers.
· Temperature effects - solar radiation and the annual temperature range.
· Bearing friction, flowing-water pressure and ice pressure - used when calculating bridge piers.
• Different countries have different codes, each scientific and reasonable for its local conditions.
Common problems in road and bridge design: safety issues in road and bridge engineering ★★★★★
• Tunnel engineering design. Principles for selecting the location of a tunnel:
a. The tunnel should be sited in stable strata, avoiding as far as possible areas of extremely complex engineering geology and hydrogeology and seriously adverse geological conditions; where such areas must be crossed, practical and reliable engineering measures must be provided.
b. For long and extra-long tunnels, and for large-section tunnels crossing a watershed, the route direction and plan position shall be determined on the basis of geological surveying and comprehensive geological exploration of
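Returning to the live-load quantities listed above, here is a minimal Python sketch of the centrifugal coefficient C = V²/(127·R) quoted there; the speed and radius values in the example are hypothetical.

```python
# Minimal sketch: centrifugal coefficient for a curved bridge,
# C = V^2 / (127 * R), with V in km/h and R (curve radius) in meters,
# as quoted in the load list above.

def centrifugal_coefficient(v_kmh, radius_m):
    """Return C = V^2 / (127 * R), the lateral load as a fraction of vehicle weight."""
    return v_kmh ** 2 / (127.0 * radius_m)

# Hypothetical design speed of 60 km/h on a 250 m radius curve.
c = centrifugal_coefficient(60.0, 250.0)
print(f"C = {c:.3f}")  # about 0.113, i.e. ~11% of vehicle weight applied laterally
```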
Technical Session # 7 : Neural Network and Brain Modeling
The First Conference on Artificial General Intelligence (AGI-08) 1-3 March 2008
Recurrent Feedback Neuronal Networks: Classification and Inference Based on Network Structure by Tsvi Achler and Eyal Amir from Department of Computer Science, University of Illinois at Urbana Champaign
1. AI -> AGI,
2. [Recurrent Feedback Neuronal Networks] Avoids Combinatorial Complexity via Simple Connectivity,
The China-Brain Project: Building China's Artificial Brain Using An Evolved Neural Net Module Approach by Hugo de Garis, Tang Jian Yu, Huang Zhiyong, Bai Lu, Chen Cong, Chen Shuo, Guo Junfei, Tan Xianjin, Tian Hao, Tian Xiaohan, Wu Xianjian, Xiong Ye, Yu Xiangqian, Huang Di of The International School of Software at Wuhan University (this work is funded by Xiamen University, 2008-11)
3. Neural Net Accelerator Board for China's Artificial Brain,
4. Multi-module neural network evolution is a challenging new research field,
How Might Probabilistic Reasoning Emerge from the Brain? Ben Goertzel and Cassio Pennachin of Novamente
5. How Might Probabilistic Reasoning Emerge from the Brain?,
Vector Symbolic Architectures: A New Building Material for Artificial General Intelligence by Simon Levy of Washington and Lee University and Ross Gayler
6. The Need for New Representational Principles
mosaic_NN_brain_agi08
The Postcard
A postcard bearing no publisher's name that was posted in Southend-on-Sea on Wednesday the 17th. July 1912 to:
Miss Richardson,
'Balmoral',
Marine Parade,
Barmouth,
N. Wales.
The pencilled message on the back of the card was as follows:
"Wed.
Dear Maudie,
Glad you are having a
fine time.
I walked along here
with Father yesterday.
The scenery is
indescribable.
Father uses the camera,
he hasn't used all the
plates yet.
We are most anxious to
see the results.
Much love to all,
Ella".
Westcliff-on-Sea
Westcliff-on-Sea is a suburb of Southend-on-Sea and a seaside resort in Essex in south-east England.
It is situated on the north bank of the Thames Estuary, about 34 miles (55 km) east of London.
The cliffs formed by erosion give views over the Thames Estuary towards the Kent coastline to the south. The coastline has been transformed into sandy beaches through the use of groynes and imported sand.
The estuary at this point has extensive mud flats. At low tide, the water typically retreats some 600 m from the beach, leaving the mud flats exposed.
The London, Tilbury and Southend Railway line arrived in the 1880's, connecting the town with London and shortening travel time.
-- Hamlet Court Road
The main shopping area in Westcliff-on-Sea is Hamlet Court Road, where the department store Havens, established in 1901, remained the anchor store until its closure in 2017.
Hamlet Court Road took its name from a manor house called Hamlet Court, which stood on land now occupied by Pavarotti's restaurant and the NatWest bank, facing towards the sea with sweeping gardens down to the rail line.
The road later developed into a strong independent retail area, and quickly became famous outside the area as the Bond Street of Essex. There were many haberdashers and specialist shops, and it was not unusual to see chauffeurs waiting for their employers to emerge from the shops.
The economic recessions of the 1980's and 90's saw the area decline. However the road underwent a £1 million regeneration in the early 2000's and a further regeneration in 2010. The street is now known for its large number of restaurants.
Henri Poincaré
So what else happened on the day that Ella posted the card to Maudie?
Well, the 17th. July 1912 was not a good day for Henri Poincaré, because he died in Paris on that day at the young age of 58.
Jules Henri Poincaré was a French mathematician, theoretical physicist, engineer, and philosopher of science. He is often described as a polymath, and in mathematics as "The Last Universalist", since he excelled in all fields of the discipline as it existed during his lifetime.
As a mathematician and physicist, he made many original contributions to pure and applied mathematics, mathematical physics, and celestial mechanics. In his research on the three-body problem, Poincaré became the first person to discover a chaotic deterministic system which laid the foundations of modern chaos theory. He is also considered to be one of the founders of the field of topology.
Poincaré emphasised the importance of paying attention to the invariance of laws of physics under different transformations, and was the first to present the Lorentz transformations in their modern symmetrical form.
Poincaré discovered the remaining relativistic velocity transformations, and recorded them in a letter to Hendrik Lorentz in 1905. Thus he obtained perfect invariance of all of Maxwell's equations, an important step in the formulation of the theory of special relativity.
In 1905, Poincaré first proposed gravitational waves (ondes gravifiques) emanating from a body and propagating at the speed of light as being required by the Lorentz transformations.
The Poincaré group used in physics and mathematics was named after him.
Early in the 20th. century he formulated the Poincaré conjecture that became over time one of the famous unsolved problems in mathematics until it was solved in 2002–2003 by Grigori Perelman.
-- Henri Poincaré - The Early Years
Poincaré was born on the 29th. April 1854 in the Cité Ducale neighborhood, Nancy, Meurthe-et-Moselle, into an influential French family. His father Léon Poincaré (1828–1892) was a professor of medicine at the University of Nancy.
His younger sister Aline married the spiritual philosopher Émile Boutroux. Another notable member of Henri's family was his cousin, Raymond Poincaré, a fellow member of the Académie Française, who was President of France from 1913 to 1920.
During his childhood Henri was seriously ill for a time with diphtheria, and received special instruction from his mother, Eugénie Launois (1830–1897).
In 1862, Henri entered the Lycée in Nancy. He spent eleven years at the Lycée, and during this time he proved to be one of the top students in every topic he studied. He excelled in written composition. His mathematics teacher described him as a "monster of mathematics," and he won first prizes in the Concours Général, a competition between the top pupils from all the Lycées across France.
Henri's poorest subjects were music and physical education, where he was described as "average at best". However, poor eyesight and a tendency towards absentmindedness may explain these difficulties.
He graduated from the Lycée in 1871 with a baccalauréat in both letters and sciences.
During the Franco-Prussian War of 1870, he served alongside his father in the Ambulance Corps.
Poincaré entered the École Polytechnique as the top qualifier in 1873 and graduated in 1875. There he studied mathematics as a student of Charles Hermite, continuing to excel and publishing his first paper (Démonstration nouvelle des propriétés de l'indicatrice d'une surface) in 1874.
From November 1875 to June 1878 he studied at the École des Mines, while continuing the study of mathematics in addition to the mining engineering syllabus, and received the degree of ordinary mining engineer in March 1879.
As a graduate of the École des Mines, he joined the Corps des Mines as an inspector for the Vesoul region in northeast France. He was on the scene of a mining disaster at Magny in August 1879 in which 18 miners died. He carried out the official investigation into the accident in a characteristically thorough and humane way.
At the same time, Poincaré was preparing for his doctorate in mathematics under the supervision of Charles Hermite. His doctoral thesis was in the field of differential equations. It was named Sur les propriétés des fonctions définies par les équations aux différences partielles.
Poincaré devised a new way of studying the properties of these equations. He not only faced the question of determining the integral of such equations, but also was the first person to study their general geometric properties. He realised that they could be used to model the behaviour of multiple bodies in free motion within the Solar System.
Poincaré graduated from the University of Paris in 1879.
-- Henri Poincaré's First Scientific Achievements
After receiving his doctorate, Poincaré began teaching as junior lecturer in mathematics at the University of Caen in Normandy. At the same time he published his first major article concerning the treatment of a class of automorphic functions.
In Caen he met his future wife, Louise Poulain d'Andecy (1857–1934), and on the 20th. April 1881, they married. Together they had four children: Jeanne (born 1887), Yvonne (born 1889), Henriette (born 1891), and Léon (born 1893).
Poincaré soon established himself as one of the greatest mathematicians of Europe. In 1881 he was invited to take a teaching position at the Faculty of Sciences of the University of Paris (the Sorbonne); he accepted the invitation, and for the rest of his career, he taught there. He was initially appointed as the associate professor of analysis. Eventually, he held the chairs of Physical and Experimental Mechanics, Mathematical Physics and Theory of Probability, and Celestial Mechanics and Astronomy.
In 1881–1882, Poincaré created a new branch of mathematics: qualitative theory of differential equations. He showed how it is possible to derive the most important information about the behavior of a family of solutions without having to solve the equation (since this may not always be possible). He successfully used this approach to problems in celestial mechanics and mathematical physics.
During the years 1883 to 1897, he taught mathematical analysis in the École Polytechnique.
-- Henri Poincaré's Career
Henri never fully abandoned his career in mining administration for mathematics. He worked at the Ministry of Public Services as an engineer in charge of northern railway development from 1881 to 1885. He eventually became chief engineer of the Corps des Mines in 1893, and inspector general in 1910.
In 1887, at the young age of 32, Poincaré was elected to the French Academy of Sciences. He became its president in 1906, and was elected to the Académie Française on the 5th. March 1908.
In 1887, he won the King of Sweden's mathematical competition for a resolution of the three-body problem concerning the free motion of multiple orbiting bodies.
In 1893, Poincaré joined the French Bureau des Longitudes, which engaged him in the synchronisation of time around the world. In 1897 Poincaré backed an unsuccessful proposal for the decimalisation of circular measure, and hence time and longitude.
It was this post which led him to consider the question of establishing international time zones and the synchronisation of time between bodies in relative motion.
In 1904, he intervened in the trial of Alfred Dreyfus, attacking the spurious scientific claims regarding evidence brought against Dreyfus.
Poincaré was the President of the Société Astronomique de France from 1901 to 1903.
-- The Death of Henri Poincaré
In 1912, Poincaré underwent surgery for a prostate problem and subsequently died from an embolism on the 17th. July 1912, in Paris. He was 58 years of age. He was laid to rest in the Poincaré family vault in the Cemetery of Montparnasse, Paris.
A former French Minister of Education, Claude Allègre, proposed in 2004 that Poincaré be reburied in the Panthéon in Paris, which is reserved for French citizens of the highest honour.
-- Overview of Henri Poincaré's Life
Poincaré made many contributions to different fields of pure and applied mathematics such as: celestial mechanics, fluid mechanics, optics, electricity, telegraphy, capillarity, elasticity, thermodynamics, potential theory, quantum theory, theory of relativity and physical cosmology.
He was also a populariser of mathematics and physics, and wrote several books for the lay public.
Among the specific topics to which he contributed are the following:
-- Algebraic topology (a field that Poincaré virtually invented)
-- The theory of analytic functions of several complex variables
-- The theory of abelian functions
-- Algebraic geometry
-- The Poincaré conjecture, proven in 2003 by Grigori Perelman
-- The Poincaré recurrence theorem
-- Hyperbolic geometry
-- Number theory
-- The three-body problem
-- The theory of diophantine equations
-- Electromagnetism
-- The special theory of relativity
-- The fundamental group
-- In the field of differential equations Poincaré has given many results that are critical for the qualitative theory of differential equations, for example the Poincaré sphere and the Poincaré map
-- Poincaré on "everybody's belief" in the Normal Law of Errors
-- An influential paper providing a novel mathematical argument in support of quantum mechanics
-- Three-body problem. The problem of finding the general solution to the motion of more than two orbiting bodies in the Solar System had eluded mathematicians since Newton's time. This was known originally as the three-body problem, and later as the n-body problem, where n is any number of more than two orbiting bodies. The n-body solution was considered very important and challenging at the close of the 19th. century. Indeed, in 1887, in honour of his 60th. birthday, Oscar II, King of Sweden, established a prize for anyone who could find the solution to the problem. The announcement was quite specific:
'Given a system of mass points that attract each other according to Newton's law, assuming that no two points ever collide, find a representation of the coordinates of each point as a series in a variable that is some known function of time and for all of whose values the series converges uniformly.'
In case the problem could not be solved, any other important contribution to classical mechanics would then be considered to be prize-worthy. The prize was finally awarded to Poincaré, even though he did not solve the original problem. One of the judges, the distinguished Karl Weierstrass, said:
"This work cannot indeed be considered as
furnishing the complete solution of the
question proposed, but it is nevertheless of
such importance that its publication will
inaugurate a new era in the history of celestial
mechanics."
Henri's contribution contained many important ideas which led to the theory of chaos. The problem as stated originally was finally solved by Karl F. Sundman for n = 3 in 1912, and was generalised to the case of n > 3 bodies by Qiudong Wang in the 1990's. The series solutions have very slow convergence. It would take millions of terms to determine the motion of the particles for even very short intervals of time, so they are unusable in numerical work.
-- Henri Poincaré's Work on Relativity
Poincaré's work at the Bureau des Longitudes on establishing international time zones led him to consider how clocks at rest on the Earth, which would be moving at different speeds relative to absolute space (aether), could be synchronised.
At the same time Dutch theorist Hendrik Lorentz was developing Maxwell's theory into a theory of the motion of charged particles ("electrons" or "ions"), and their interaction with radiation. In 1895 Lorentz had introduced an auxiliary quantity called "local time," and introduced the hypothesis of length contraction to explain the failure of optical and electrical experiments to detect motion relative to the aether.
Poincaré was a constant interpreter (and sometimes friendly critic) of Lorentz's theory. Poincaré as a philosopher was interested in the "deeper meaning". Thus he interpreted Lorentz's theory, and in so doing he came up with many insights that are now associated with special relativity.
In The Measure of Time (1898), Poincaré said:
"A little reflection is sufficient to understand
that all these affirmations have by themselves
no meaning. They can have one only as the
result of a convention."
He also argued that scientists have to set the constancy of the speed of light as a postulate to give physical theories the simplest form. Based on these assumptions he discussed in 1900 Lorentz's "wonderful invention" of local time, and remarked that it arose when moving clocks are synchronised by exchanging light signals assumed to travel with the same speed in both directions in a moving frame.
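In modern notation this light-signal synchronisation can be stated compactly. Assuming a signal leaves A at time t_A, is reflected at B, and returns to A at time t'_A (all read on A's clock), the clock at B is set so that

$$ t_B \;=\; t_A + \tfrac{1}{2}\,(t'_A - t_A) \;=\; \tfrac{1}{2}\,(t_A + t'_A), $$

which encodes exactly the convention that light takes equal times on the outward and return legs.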
In 1892 Poincaré developed a mathematical theory of light including polarization. His vision of the action of polarizers and retarders, acting on a sphere representing polarized states, is called the Poincaré sphere. It was shown that the Poincaré sphere possesses an underlying Lorentzian symmetry, by which it can be used as a geometrical representation of Lorentz transformations and velocity additions.
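In the modern formulation (Stokes parameters, a later formalism added here only for orientation), a fully polarized beam with parameters (S₀, S₁, S₂, S₃) satisfies

$$ S_1^2 + S_2^2 + S_3^2 = S_0^2, $$

so the normalized vector (S₁, S₂, S₃)/S₀ is a point on a unit sphere: the Poincaré sphere.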
Henri discussed the "principle of relative motion" in two papers in 1900, and named it the principle of relativity in 1904, according to which no physical experiment can discriminate between a state of uniform motion and a state of rest.
In 1905 Poincaré wrote to Lorentz about Lorentz's paper of 1904, which Poincaré described as:
"A paper of supreme importance".
In this letter he pointed out an error Lorentz had made when he had applied his transformation to one of Maxwell's equations, i.e. that for charge-occupied space. Henri also questioned the time dilation factor given by Lorentz.
In a second letter to Lorentz, Poincaré gave his own reason why Lorentz's time dilation factor was indeed correct after all—it was necessary to make the Lorentz transformation form a group—and he gave what is now known as the relativistic velocity-addition law.
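For collinear velocities u and v, the law Poincaré stated takes the familiar form

$$ w = \frac{u + v}{1 + uv/c^2}, $$

which never exceeds c so long as |u| and |v| do not exceed c.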
-- The Mass–Energy Relation
Like others before, Poincaré (1900) discovered a relation between mass and electromagnetic energy. While studying the conflict between the action/reaction principle and the Lorentz ether theory, he tried to determine whether the center of gravity still moves with a uniform velocity when electromagnetic fields are included.
Henri noticed that the action/reaction principle does not hold for matter alone, but that the electromagnetic field has its own momentum. Poincaré concluded that the electromagnetic field energy of an electromagnetic wave behaves like a fictitious fluid with a mass density of E/c².
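In symbols: an electromagnetic wave carrying energy E carries momentum E/c, so treating the radiation as a fluid assigns it an equivalent mass

$$ m_{\text{eff}} = \frac{E}{c^2}, $$

that is, an energy density e behaves like a mass density ρ = e/c².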
If the center of mass frame is defined by both the mass of matter and the mass of the fictitious fluid, and if the fictitious fluid is indestructible—it is neither created nor destroyed—then the motion of the center of mass frame remains uniform.
However electromagnetic energy can be converted into other forms of energy, so Poincaré assumed that there exists a non-electric energy fluid at each point of space, into which electromagnetic energy can be transformed and which also carries a mass proportional to the energy.
In this way, the motion of the center of mass remains uniform. Poincaré said that one should not be too surprised by these assumptions, since they are only mathematical fictions.
However, Poincaré's resolution led to a paradox when changing frames: if a Hertzian oscillator radiates in a certain direction, it will suffer a recoil from the inertia of the fictitious fluid. Poincaré performed a Lorentz boost to the frame of the moving source.
He noted that energy conservation holds in both frames, but that the law of conservation of momentum is violated. This would allow perpetual motion, a notion which he abhorred. The laws of nature would have to be different in the frames of reference, and the relativity principle would not hold. Therefore, he argued that also in this case there has to be another compensating mechanism in the aether.
Poincaré himself came back to this topic in his St. Louis lecture (1904). He rejected the possibility that energy carries mass, and criticized his own solution to compensate the above-mentioned problems:
"The apparatus will recoil as if it were a cannon
and the projected energy a ball, and that
contradicts the principle of Newton, since our
present projectile has no mass; it is not matter,
it is energy.
Shall we say that the space which separates the
oscillator from the receiver and which the
disturbance must traverse in passing from one
to the other, is not empty, but is filled not only
with ether, but with air, or even in inter-planetary
space with some ethereal, yet ponderable fluid;
that this matter receives the shock, as does the
receiver, at the moment the energy reaches it,
and recoils, when the disturbance leaves it?
That would save Newton's principle, but it is not
true. If the energy during its propagation remained
always attached to some material substratum, this
matter would carry the light along with it, and
Fizeau has shown, at least for the air, that there is
nothing of the kind.
Michelson and Morley have since confirmed this.
We might also suppose that the motions of matter
proper were exactly compensated by those of the
aether; but that would lead us to the same
considerations as those made a moment ago.
The principle, if thus interpreted, could explain
anything, since whatever the visible motions, we
could imagine hypothetical motions to compensate
them.
But if it can explain anything, it will allow us to
foretell nothing; it will not allow us to choose
between the various possible hypotheses, since it
explains everything in advance.
It therefore becomes useless."
Henri refers to the Hertz assumption of total aether entrainment that was falsified by the Fizeau experiment, but that experiment does indeed show that light is partially "carried along" with a substance.
Finally, in 1908, Henri revisited the problem, ultimately abandoning the principle of reaction altogether in favor of a solution based on the inertia of the aether itself.
Henri also discussed two other unexplained effects:
-- Non-conservation of mass implied by Kaufmann's experiments on the mass of fast moving electrons
-- The non-conservation of energy in the radium experiments of Marie Curie.
It was Albert Einstein's concept of mass–energy equivalence (1905), that a body losing energy as radiation or heat loses mass of amount m = E/c², that resolved Poincaré's paradox, without using any compensating mechanism within the ether.
The Hertzian oscillator loses mass in the emission process, and momentum is conserved in any frame. However, concerning Poincaré's solution of the Center of Gravity problem, Einstein noted that Poincaré's formulation and his own from 1906 were mathematically equivalent.
-- Gravitational Waves
In 1905 Poincaré first proposed gravitational waves emanating from a body and propagating at the speed of light. He wrote:
"It has become important to examine this hypothesis
more closely, and in particular to ask in what ways it
would require us to modify the laws of gravitation.
That is what I have tried to determine; at first I was
led to assume that the propagation of gravitation is
not instantaneous, but happens with the speed of
light."
-- Poincaré and Einstein
Einstein's first paper on relativity was published three months after Poincaré's short paper, but before Poincaré's longer version. Einstein relied on the principle of relativity to derive the Lorentz transformations, and used a similar clock synchronisation procedure (Einstein synchronisation) to the one that Poincaré (1900) had described, but Einstein's paper was remarkable in that it contained no references at all.
Poincaré never acknowledged Einstein's work on special relativity. However, Einstein expressed sympathy with Poincaré's outlook obliquely in a letter to Hans Vaihinger on the 3rd. May 1919, when Einstein considered Vaihinger's general outlook to be close to his own, and Poincaré's to be close to Vaihinger's.
In public, Einstein acknowledged Poincaré posthumously in the text of a lecture in 1921 titled "Geometry and Experience" in connection with non-Euclidean geometry, but not in connection with special relativity.
A few years before his death, Einstein commented on Poincaré as being one of the pioneers of relativity, saying:
"Lorentz had already recognized that the
transformation named after him is essential
for the analysis of Maxwell's equations, and
Poincaré deepened this insight still further."
-- Assessments of Poincaré and Relativity
Poincaré's work in the development of special relativity is well recognised, although most historians stress that despite many similarities with Einstein's work, the two had very different research agendas and interpretations of their work.
Poincaré developed a similar physical interpretation of local time and noticed the connection to signal velocity, but contrary to Einstein, he continued to use the aether concept in his papers, and argued that clocks at rest in the aether show the "true" time, and moving clocks show the local time.
So Poincaré tried to keep the relativity principle in accordance with classical concepts, while Einstein developed a mathematically equivalent kinematics based on the new physical concepts of the relativity of space and time.
While this is the view of most historians, a minority go much further, such as E. T. Whittaker, who held that Poincaré and Lorentz were the true discoverers of relativity.
-- Algebra and Number Theory
Poincaré introduced group theory to physics, and was the first to study the group of Lorentz transformations. He also made major contributions to the theory of discrete groups and their representations.
-- Topology
The subject had been given a kind of definition by Felix Klein in his "Erlangen Program" (1872): the study of the invariants of arbitrary continuous transformations, a kind of geometry.
The term "topology" was introduced, as suggested by Johann Benedict Listing, instead of the previously used term "Analysis situs".
Some important concepts were introduced by Enrico Betti and Bernhard Riemann. But the foundation of this science, for a space of any dimension, was created by Poincaré. His first article on this topic appeared in 1894.
Henri's research in geometry led to the abstract topological definition of homotopy and homology. He also first introduced the basic concepts and invariants of combinatorial topology, such as Betti numbers and the fundamental group.
Poincaré proved a formula relating the number of edges, vertices and faces of an n-dimensional polyhedron (the Euler–Poincaré theorem), and gave the first precise formulation of the intuitive notion of dimension.
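For an ordinary convex polyhedron the formula reads V − E + F = 2. In Poincaré's general form, if α_k counts the k-dimensional faces of an n-dimensional polyhedron, the alternating sum gives the Euler characteristic, which also equals the alternating sum of the Betti numbers b_k:

$$ \chi \;=\; \sum_{k=0}^{n} (-1)^k \alpha_k \;=\; \sum_{k=0}^{n} (-1)^k b_k . $$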
-- Astronomy and Celestial Mechanics
Poincaré published two classic monographs:
-- "New Methods of Celestial Mechanics" (1892–1899)
-- "Lectures on Celestial Mechanics" (1905–1910)
In them, he successfully applied the results of his research to the problem of the motion of three bodies, and studied in detail the behavior of the solutions (periodicity, stability, asymptotics, etc.). Poincaré introduced the small-parameter method, fixed points, integral invariants, variational equations, and the study of the convergence of asymptotic expansions.
Generalizing a theorem of Bruns (1887), Poincaré showed that the three-body problem is not integrable. In other words, the general solution of the three-body problem cannot be expressed in terms of algebraic and transcendental functions of unambiguous coordinates and velocities of the bodies. His work in this area was the first major achievement in celestial mechanics since Isaac Newton.
The two monographs include an idea of Poincaré, which later became the basis for mathematical "chaos theory" and the general theory of dynamical systems.
Poincaré authored important works on astronomy for the equilibrium figures of a gravitating rotating fluid. He introduced the important concept of bifurcation points, and proved the existence of equilibrium figures such as the non-ellipsoids, including ring-shaped and pear-shaped figures, and their stability.
For this discovery, Poincaré received the Gold Medal of the Royal Astronomical Society (1900).
-- Differential Equations and Mathematical Physics
After defending his doctoral thesis on the study of singular points of the system of differential equations, Poincaré wrote a series of memoirs under the title "On curves defined by differential equations" (1881–1882).
In these articles, he built a new branch of mathematics, called "qualitative theory of differential equations". Poincaré showed that even if the differential equation cannot be solved in terms of known functions, the very form of the equation provides a wealth of information about the properties and behavior of the solutions.
In particular, Poincaré investigated the nature of the trajectories of the integral curves in the plane, gave a classification of singular points (saddle, focus, center, node), introduced the concept of a limit cycle and loop index, and showed that the number of limit cycles is always finite, except for some special cases.
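Here is a minimal numerical sketch of this qualitative viewpoint, using the van der Pol oscillator (a later standard example, not Poincaré's own): its trajectories cannot be written in closed form, yet numerical integration shows them settling onto a limit cycle, exactly the kind of object the qualitative theory classifies.

```python
# Minimal sketch: the van der Pol equation x'' - mu*(1 - x^2)*x' + x = 0
# has no closed-form solution, but trajectories wind onto a limit cycle.
import numpy as np
from scipy.integrate import solve_ivp

def van_der_pol(t, state, mu=1.0):
    x, v = state
    return [v, mu * (1.0 - x**2) * v - x]

sol = solve_ivp(van_der_pol, (0.0, 50.0), [0.1, 0.0], max_step=0.05)
x = sol.y[0]
# After transients die out, x oscillates with a fixed amplitude (~2 for mu = 1),
# tracing a closed curve in the (x, x') plane: the limit cycle.
late = x[int(len(x) * 0.8):]
print(f"late-time amplitude ~ {np.abs(late).max():.2f}")
```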
Poincaré also developed a general theory of integral invariants and solutions of the variational equations. For the finite-difference equations, he created a new direction – the asymptotic analysis of the solutions.
He applied all of these achievements to practical problems of mathematical physics and celestial mechanics, and the methods used were the basis of his topological works.
Poincaré's work habits have been compared to a bee flying from flower to flower. Poincaré was interested in the way his mind worked; he studied his habits, and gave a talk about his observations in 1908 at the Institute of General Psychology in Paris. He linked his way of thinking to how he made several discoveries.
The mathematician Darboux claimed that Poincaré was un intuitif (an intuitive), arguing that this is demonstrated by the fact that he worked so often by visual representation.
Jacques Hadamard wrote that Poincaré's research demonstrated marvelous clarity, and Poincaré himself wrote that he believed that logic was not a way to invent, but a way to structure ideas, and that logic limits ideas.
The fact that renowned theoretical physicists like Poincaré, Boltzmann or Gibbs were not awarded the Nobel Prize is seen as evidence that the Nobel committee had more regard for experimentation than theory. In Poincaré's case, several of those who nominated him pointed out that the greatest problem was to name a specific discovery, invention, or technique.
-- Édouard Toulouse's Characterisation
Poincaré's mental organisation was interesting not only to Poincaré himself, but also to Édouard Toulouse, a psychologist based in Paris. Toulouse wrote a book entitled Henri Poincaré (1910). In it, he discussed Poincaré's regular schedule:
"He worked during the same times each day in
short periods of time. He undertook mathematical
research for four hours a day, between 10 a.m. and
noon, then again from 5 p.m. to 7 p.m. He would
read articles in journals later in the evening.
His normal work habit was to solve a problem
completely in his head, then commit the completed
problem to paper.
He was ambidextrous and nearsighted.
His ability to visualise what he heard proved
particularly useful when he attended lectures, since
his eyesight was so poor that he could not see
properly what the lecturer wrote on the blackboard.
These abilities were offset to some extent by his shortcomings:
-- He was physically clumsy and artistically inept.
-- He was always in a rush, and disliked going back
for changes or corrections.
-- He never spent a long time on a problem since
he believed that the subconscious would continue
working on the problem while he consciously
worked on another problem.
In addition, Toulouse stated that most mathematicians worked from principles already established, while Poincaré started from basic principles each time (O'Connor et al., 2002).
His method of thinking is well summarised as:
"Accustomed to neglecting details and to looking
only at mountain tops, he went from one peak to
another with surprising rapidity, and the facts he
discovered, clustering around their center, were
instantly and automatically pigeonholed in his
memory."
— Belliver (1956).
-- Philosophy
Poincaré had philosophical views opposite to those of Bertrand Russell and Gottlob Frege, who believed that mathematics was a branch of logic. Poincaré strongly disagreed, claiming that intuition was the life of mathematics. Poincaré gives an interesting point of view in his 1902 book Science and Hypothesis:
"For a superficial observer, scientific truth is beyond
the possibility of doubt; the logic of science is infallible,
and if the scientists are sometimes mistaken, this is
only from their mistaking its rules."
Poincaré believed that arithmetic is synthetic. He argued that Peano's axioms cannot be proven non-circularly with the principle of induction (Murzi, 1998), therefore concluding that arithmetic is a priori synthetic and not analytic.
Poincaré then went on to say that mathematics cannot be deduced from logic since it is not analytic. His views were similar to those of Immanuel Kant. He strongly opposed Cantorian set theory, objecting to its use of impredicative definitions.
However, Poincaré did not share Kantian views in all branches of philosophy and mathematics. For example, in geometry, Poincaré believed that the structure of non-Euclidean space can be known analytically.
Poincaré held that convention plays an important role in physics. His view (and some later, more extreme versions of it) came to be known as "conventionalism". Poincaré believed that Newton's first law was not empirical, but is a conventional framework assumption for mechanics.
He also believed that the geometry of physical space is conventional. He considered examples in which the same physical situation can be described in two ways: either as a non-Euclidean space measured by rigid rulers, or as a Euclidean space in which the rulers are expanded or shrunk by a variable heat distribution.
However, Poincaré thought that we were so accustomed to Euclidean geometry that we would prefer to change the physical laws to save Euclidean geometry rather than shift to a non-Euclidean physical geometry.
-- Free Will
Poincaré's famous lectures before the Société de Psychologie in Paris were cited by Jacques Hadamard as the source for the idea that creativity and invention consist of two mental stages, first random combinations of possible solutions to a problem, followed by a critical evaluation.
Although he most often spoke of a deterministic universe, Poincaré said that the subconscious generation of new possibilities involves chance.
"It is certain that the combinations which present
themselves to the mind in a kind of sudden
illumination after a somewhat prolonged period of unconscious work are generally useful and fruitful combinations... all the combinations are formed as
a result of the automatic action of the subliminal
ego, but those only which are interesting find their
way into the field of consciousness.
A few only are harmonious, and consequently at
once useful and beautiful, and they will be capable
of affecting the geometrician's special sensibility
I have been speaking of; which, once aroused, will
direct our attention upon them, and will thus give
them the opportunity of becoming conscious.
In the subliminal ego, on the contrary, there reigns
what I would call liberty, if one could give this name
to the mere absence of discipline and to disorder
born of chance."
Poincaré's two stages—random combinations followed by selection—became the basis for Daniel Dennett's two-stage model of free will.
Fangruida -- Modern Science and Technology Engineering and Comprehensive High-end Technology R&D, Design and Manufacturing (Introduction to Modern Science and Engineering Technology Research)
2013 v2.3; 2021 v2.5. Online global version, mobile version (compiled by Bick in November 2021, Colombia)
Comprehensive deep development of the Moon; ocean cities and marine architecture; desert cities; mountain cities; life genetic engineering; green plant nutrition engineering; smart engineering; nuclear engineering (peaceful use of nuclear energy); advanced manufacturing.
-- New world intelligence revolution, new industrial revolution, new planetary revolution, new Moon revolution, new cosmic revolution
Architecture and bridge design, large-scale circuit design (chip development, etc.), mechanical and electrical product design and manufacturing, pharmaceutical product development and design, genetic engineering, aerospace technology design and manufacturing, atomic energy development and utilization, agricultural engineering, computer-aided design and manufacturing, new-material research, development and design, military engineering design and manufacturing, industrial robots, aircraft and ships, missiles, spacecraft, spaceships, rockets, submarines, hypersonic missiles and the like are all of great importance; foresight and a high degree of integration are the key. These sciences and technologies are the powerful driving force of historical development, and the key to whether a country can reach the peak of the world.
With the rapid development of modern science, all kinds of design software emerge in an endless stream: mathematical software, civil-engineering software, mechanical software, electrical and electronic software, chemical software, aircraft software, ship software, missile software, spacecraft software, rocket software, materials software, bionic simulation software, medical software, and so on. Their appearance and wide application are of great significance to industrial modernization and intelligence: they greatly advance artificial intelligence and the rapid development of human society. So too marine engineering, overall lunar development engineering, highly integrated intelligent engineering, high-speed heavy-lift rocket transportation engineering, submarine tunnel engineering, reservoir and dam engineering, agricultural engineering, biomedical engineering and so on: overall lunar engineering development planning, Mars engineering development and design, desert engineering (desert cities), alpine cities, marine engineering (ocean cities), life genetic engineering, green plant nutrition engineering, VLSI design and manufacturing, large-scale civil and hydraulic engineering, roads and bridges, tunnels, super-tall buildings, all of them.
The modern scientific revolution is led by the revolution in physics, takes the emergence of modern cosmology, molecular biology, systems science and soft science as its important content, and is characterized by the interpenetration of natural science, social science and the science of thinking to form interdisciplinary subjects.
In the past 30 years, emerging technologies such as computers, energy, new materials, space, and biology have emerged successively, causing the third scientific and technological revolution. The third technological revolution far exceeds the previous two in terms of scale, depth and impact.
Basic Features:
1. Greatly promoted the development of social productive forces—changes in the means to improve labor productivity;
2. Promoting changes in the social and economic structure and social life structure - the proportion of the tertiary industry has increased. Changes in people's daily life such as food, clothing, housing and transportation;
3. It has promoted the adjustment of the international economic structure - regions of the world are more closely connected.
4. Planetary revolution, lunar revolution: lunar engineering, the lunar industrial intelligent city, the Moon-Earth round-trip communication system.
We should develop the moon fast; it is a real chance to overtake on the curve. The physical presence of the moon will be of great strategic importance for thousands of years to come. Many resources are first come, first served: orbits, the best lunar sites, electromagnetic wave bands, and so on.
Make full use of the local resources and environment of the moon to quickly build a city. Minimize the amount of supplies and equipment that needs to be launched to the Moon.
5. Ocean City, Ocean Building, Desert City, Mountain City
6. Life Genetic Engineering, Drug Research and Development
7. Green Plant Nutrition Engineering
8. Smart Engineering
9. Nuclear Engineering
10. Advanced Manufacturing Engineering
The rapid development of modern science and technology changes with each passing day, and inventions and technological innovations are numerous. The most important and most contested technical fields, however, mainly include: lunar engineering, the lunar industrial intelligent city and the Moon-Earth round-trip communication system (lunar radius: 1,737 km); ocean cities and ocean buildings; desert cities and mountain cities; life genetic engineering and drug research and development; green plant nutrition engineering; smart engineering; nuclear engineering; advanced manufacturing engineering; and others. It is in these fields and categories that the development competition among countries takes place. Of course, military and aerospace technologies are among them as well.
Scientific discoveries can last for thousands of years, while technological inventions stay fresh for only a few decades and are obsolete within a few hundred years: electronic products, for example, are updated quite quickly and have short life cycles, as do smart cars, smartphones and the like. Of course, a technology may also survive for hundreds of years. Even scientific discoveries are not permanent: tens of thousands of years from now, people will take a new leap in their understanding of the universe and of natural laws and phenomena. When people live on the moon and on Mars, their inventions may seem unbelievable to us; by comparison, we people on Earth will look like uncivilized ancient humans. The intelligence of lunar humans may be tens or hundreds of times that of present-day Earth humans, and the scientific discoveries of that time are unimaginable. Mathematics, physics and chemistry, the natural sciences, agriculture, medicine, industry, law and commerce, literature, history, philosophy, classics, education and so on: everything will be renewed and mutated.
Mathematics
The science of quantitative relationships and spatial forms in the real world. It arose and developed in the long practical activity of human beings, originating in counting and measurement. With the development of the productive forces, more and more quantitative study of natural phenomena was required; at the same time, through its own development, mathematics acquired a high degree of abstraction, rigorous logic and wide applicability. It is roughly divided into two categories: basic mathematics (also known as pure mathematics) and applied mathematics. The former includes branches such as mathematical logic, number theory, algebra, geometry, topology, the theory of functions, functional analysis and differential equations; the latter includes branches such as probability theory, mathematical statistics, computational mathematics, operations research and combinatorial mathematics.
Basic technical sciences mainly include civil engineering, electromechanical engineering, chemical engineering, information engineering, aerospace engineering, ocean engineering, mining engineering, medical engineering, materials engineering, computational engineering, agricultural engineering, energy engineering, lunar engineering, Mars engineering, life engineering and so on.
Computational mathematics and its application software: this major trains students to master the basic theory, knowledge and methods of mathematical science, to be able to apply mathematical knowledge and use computers to solve practical problems, and to become senior personnel engaged in research, teaching, practical application and management in science and technology, education, economics and management departments. The computer-software side of the major cultivates students who develop in an all-round way (morally, intellectually, physically, aesthetically and in practical skills), master the professional theory, basic knowledge and basic skills of computer programming and its applications, are proficient in the latest internationally popular software development environments and tools, are familiar with international software development standards, and have strong practical software development ability and good software engineering literacy.
Modern mathematics is an edifice built from a series of abstract structures. It rests on humanity's innate belief in the inevitability and accuracy of mathematical reasoning, and it is the concentrated expression of confidence in the capacity, origin and power of human reason. Deductive reasoning from self-evident axioms is taken to be absolutely reliable: if an axiom is true, then the conclusions deduced from it must also be true. By applying these seemingly clear, correct and perfect logics, mathematicians reach conclusions that appear unquestionable and irrefutable. Naturally, mathematics keeps developing and diverging; an eternal, unchanging mathematics is unrealistic, mainly because the logical structure of human thought changes, and mathematics will continue to mutate or diverge: mathematical logic, natural logic, image logic, hybrid compound logic.
In fact, the understanding of the essential characteristics of mathematics sketched above proceeds from the sources of mathematics, its mode of existence and its level of abstraction, and sees those characteristics mainly in the results of mathematical research. Common general-purpose mathematical software packages include Matlab, Mathematica and Maple: Matlab excels at numerical calculation, while Mathematica and Maple excel at symbolic manipulation and formula derivation. (A small Python analogue of this numeric/symbolic split follows the list below.)
(2) Dedicated math packages include:
Drawing software: MathCAD, Tecplot, IDL, Surfer, Origin, SmartDraw, DSP2000
Numerical computing: Matcom, DataFit, S-Spline, Lindo, Lingo, O-Matrix, Scilab, Octave
Numerical calculation libraries: linpack/lapack/BLAS/GERMS/IMSL/CXML
Finite-element calculation: ANSYS, MARC, PARSTRAN, FLUENT, FEMLAB, FlexPDE, Algor, COSMOS, ABAQUS, ADINA
Mathematical statistics: GAUSS, SPSS, SAS, Splus
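By way of illustration only, here is a small Python analogue of the numeric/symbolic division of labour described above, with NumPy standing in for Matlab-style numerics and SymPy for Mathematica/Maple-style symbolic derivation; the choice of packages and the example integral are mine, not the text's.

```python
# Numeric vs symbolic: the same integral, the integral of sin(x) on [0, pi].
import numpy as np
import sympy as sp

# Numeric route (NumPy): trapezoidal quadrature on a fine grid.
xs = np.linspace(0.0, np.pi, 10_001)
ys = np.sin(xs)
numeric = float(np.sum((ys[:-1] + ys[1:]) * np.diff(xs) / 2.0))

# Symbolic route (SymPy): closed-form antiderivative, evaluated exactly.
x = sp.symbols("x")
symbolic = sp.integrate(sp.sin(x), (x, 0, sp.pi))

print(numeric)   # approximately 2.0
print(symbolic)  # exactly 2
```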
Obviously, the results (mathematics as a deductive theoretical system) do not reflect the whole picture of mathematics. Another very important aspect of the whole of mathematics is the process of mathematical research: in general, mathematics is a dynamic process, an "experimental process of thinking", the process by which mathematical truth is abstracted and generalized; the logical deductive system is a natural product of this process. The process of mathematical research shows the richness of mathematical objects and the human invention of mathematics. "Mathematics is a language", and mathematical activity is social: within the historical development of human civilization, it is the crystallization of the high intelligence with which human beings understand nature, adapt to and transform nature, and improve themselves and society. Mathematics has a decisive influence on the human way of thinking, and it is no exaggeration to say that among mathematics, physics and chemistry, mathematics comes first.
On the basis of this understanding of the essential characteristics of mathematics, people have also discussed its specific characteristics from different angles. The more common view is that mathematics is abstract, precise and widely applicable, of which the most essential characteristic is abstraction. In addition, viewed from the process of mathematical research and from the relationship between mathematics and other disciplines, mathematics also has imagistic, plausible and quasi-empirical ("falsifiable") features. Matlab, for its part, is well suited to the engineering world, particularly through its toolboxes, fast code and many integrations with third-party software, such as the optimization toolbox.
The most obvious third-party example is COMSOL.
Mathematica's syntax is excellent: so good that it supports almost all programming paradigms.
The understanding of the characteristics of mathematics is itself characteristic of its times. For example, regarding the rigor of mathematics, each period of mathematical history has had different standards: from Euclidean geometry to Lobachevskian geometry to Hilbert's axiom system, the criteria for rigor vary widely, especially after Gödel proposed and proved the incompleteness theorems. It was then found that even axiomatization, a rigorous scientific method once highly regarded, is flawed. The rigor of mathematics thus reveals itself in the history of mathematical development and is relative. Regarding the plausibility of mathematics,
Mathematics is the tool and means of physical research. Some research methods of physics embody strong mathematical ideas, so the process of learning physics can also improve mathematical understanding. Mathematical logic is the branch of formal logic that studies logic by symbolic and mathematical methods.