
TMax 400, Vivitar 70-150CF f/3.8 on AE-1. HC-110, 1:160, 60 min @ 15C semi stand
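(For anyone mixing the same soup, here is a quick sketch of the dilution arithmetic. It assumes the usual reading of "1:160" as 1 part HC-110 concentrate to 160 parts water; the 500 ml tank is just an example, not from the caption above.)

```python
# Dilution arithmetic for a 1:160 HC-110 semi-stand soup.
# Assumes "1:160" means 1 part concentrate to 160 parts water;
# the 500 ml tank volume is a made-up example.

def hc110_split(tank_ml: float, ratio: int = 160) -> tuple[float, float]:
    """Return (concentrate_ml, water_ml) for a 1:ratio dilution."""
    concentrate = tank_ml / (ratio + 1)
    return concentrate, tank_ml - concentrate

conc, water = hc110_split(500.0)
print(f"{conc:.1f} ml HC-110 + {water:.1f} ml water")  # 3.1 ml + 496.9 ml
```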

cliché prediction:

New Years colorful kisses -

hope warm glow endures.

©2011hjwizell

Gordon E. Moore, the Intel Co-Founder Behind Moore’s Law, Dies at 94

His prediction in the 1960s about exponential advances in computer chip technology charted a course for the age of high tech.

 


Gordon E. Moore in 1990 at the Silicon Valley headquarters of Intel, which he founded in 1968 with Robert Noyce. Credit...Alamy

By Holcomb B. Noble and Katie Hafner

March 24, 2023, 9:36 p.m. ET


Gordon E. Moore, a co-founder and former chairman of Intel Corporation, the California semiconductor chip maker that helped give Silicon Valley its name, achieving the kind of industrial dominance once held by the giant American railroad or steel companies of another age, died on Friday at his home in Hawaii. He was 94.

 

His death was confirmed by Intel and the Gordon and Betty Moore Foundation. They did not provide a cause.

 

Along with a handful of colleagues, Mr. Moore could claim credit for bringing laptop computers to hundreds of millions of people and embedding microprocessors into everything from bathroom scales, toasters and toy fire engines to cellphones, cars and jets.

 

Mr. Moore, who had wanted to be a teacher but could not get a job in education and later called himself the “accidental entrepreneur,” became a billionaire as a result of an initial $500 investment in the fledgling microchip business, which turned electronics into one of the world’s largest industries.

 


 

And it was he, his colleagues said, who saw the future. In 1965, in what became known as Moore’s Law, he predicted that the number of transistors that could be placed on a silicon chip would double at regular intervals for the foreseeable future, thus increasing the data-processing power of computers exponentially.
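(As a rough worked example of what fixed-interval doubling implies — my own sketch, not the obituary’s numbers — the following assumes the commonly cited two-year doubling period, seeded with the roughly 2,300 transistors of Intel’s 4004 from 1971.)

```python
# Sketch of Moore's Law-style growth. Assumes the commonly cited
# two-year doubling period and seeds with the Intel 4004's roughly
# 2,300 transistors (1971). Parameters are illustrative.

def transistors(year: int, base_year: int = 1971,
                base_count: float = 2300, doubling_years: float = 2.0) -> float:
    """Project a transistor count under fixed-interval doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1981, 1991, 2001):
    print(y, round(transistors(y)))
# 1971 2300 / 1981 73600 / 1991 2355200 / 2001 75366400
```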

 

He added two corollaries later: The evolving technology would make computers more and more expensive to build, yet consumers would be charged less and less for them because so many would be sold. Moore’s Law held up for decades.

 

Through a combination of Mr. Moore’s brilliance, leadership, charisma and contacts, and those of his partner and Intel co-founder, Robert Noyce, the two assembled a group widely regarded as among the boldest and most creative technicians of the high-tech age.

 

This was the group that advocated the use of the thumbnail-thin chips of silicon, a highly polished, chemically treated sandy substance — one of the most common natural resources on earth — because of what turned out to be silicon’s amazing hospitality in housing smaller and smaller electronic circuitry that could work at higher and higher speeds.

 


 

With its silicon microprocessors, the brains of a computer, Intel enabled American manufacturers in the mid-1980s to regain the lead in the vast computer data-processing field from their formidable Japanese competitors. By the ’90s, Intel had placed its microprocessors in 80 percent of the computers that were being made worldwide, becoming the most successful semiconductor company in history.

 

Much of this happened under Mr. Moore’s watch. He was chief executive from 1975 to 1987, when Andrew Grove succeeded him, and he remained chairman until 1997.

 

As his wealth grew, Mr. Moore also became a major figure in philanthropy. In 2001, he and his wife created the Gordon and Betty Moore Foundation with a donation of 175 million Intel shares. That same year, they donated $600 million to the California Institute of Technology, the largest single gift to an institution of higher learning at the time. The foundation’s assets currently exceed $8 billion, and it has given away more than $5 billion since its founding.

 

In interviews, Mr. Moore was characteristically humble about his achievements, particularly the technical advances that Moore’s Law made possible.

 


 

“What I could see was that semiconductor devices were the way electronics were going to become cheap. That was the message I was trying to get across,” he told the journalist Michael Malone in 2000. “It turned out to be an amazingly precise prediction — a lot more precise than I ever imagined it would be.”

 

Not only was Mr. Moore predicting that electronics would become much cheaper over time as the industry shifted away from discrete transistors and tubes to silicon microchips, but over the years his prediction proved so reliable that technology firms based their product strategies on the assumption that Moore’s Law would hold.

 

“Any business doing rational multiyear planning had to assume this rate of change or else get steamrolled,” said Harry Saal, a longtime Silicon Valley entrepreneur.

 

“That’s his legacy,” said Arthur Rock, an early investor in Intel and friend of Mr. Moore’s. “It’s not Intel. It’s not the Moore Foundation. It’s that phrase: Moore’s Law.”

 


Mr. Moore during Intel’s early days. His prediction, a few years earlier, that the number of transistors that could be placed on a silicon chip would double at regular intervals became known as Moore’s Law. Credit...Intel

Gordon Earl Moore was born on Jan. 3, 1929, in San Francisco. He grew up in Pescadero, a small coastal town south of San Francisco, where his father, Walter H. Moore, was deputy sheriff and the family of his mother, the former Florence Almira Williamson, ran the general store.

 

Mr. Moore enrolled at San Jose State College (now San José State University), where he met Betty Whitaker, a journalism student. They married in 1950. That year, he completed his undergraduate studies at the University of California, Berkeley, with a degree in chemistry. In 1954, he received his doctorate, also in chemistry, from the California Institute of Technology.

 

One of the first jobs he applied for was as a manager with Dow Chemical. “They sent me to a psychologist to see how this would fit,” Mr. Moore wrote in 1994. “The psychologist said I was OK technically but I’d never manage anything.”

 

So Mr. Moore took a position with the Applied Physics Laboratory at Johns Hopkins University in Maryland. Then, looking for a way back to California, he interviewed at Lawrence Livermore Laboratory in Livermore, Calif. He was offered a job, “but I decided I didn’t want to take spectra of exploding nuclear bombs, so I turned it down,” he wrote.

 

Instead, in 1956, Mr. Moore joined William Shockley, a co-inventor of the transistor, at his new West Coast semiconductor start-up, whose aim was to make a cheap silicon transistor.

 

But the company, Shockley Semiconductor, foundered under Mr. Shockley, who had no experience running a company. In 1957, Mr. Moore and Mr. Noyce joined a group of defectors who came to be known as “the traitorous eight.” With each putting in $500, along with $1.3 million in backing from the aircraft pioneer Sherman Fairchild, the eight men left to form the Fairchild Semiconductor Corporation, which became a pioneer in manufacturing integrated circuits.

 

Bitten by the entrepreneurial bug, Mr. Moore and Mr. Noyce decided in 1968 to form their own company, focusing on semiconductor memory. They wrote what Mr. Moore described as a “very general” business plan.

 

“It said we were going to work with silicon … and make interesting products,” he said in an interview in 1994.

 

Their vague proposal notwithstanding, they had no trouble finding financial backing.

 

With $2.5 million in capital, Mr. Moore and Mr. Noyce called their start-up Integrated Electronics Corporation, and later shortened it to Intel. The third employee was Mr. Grove, a young Hungarian immigrant who had worked under Mr. Moore at Fairchild.

 

After some indecision about which technology to focus on, the three men settled on a new version of MOS — metal oxide semiconductor — technology called silicon-gate MOS, which used silicon rather than aluminum for the transistor gates to improve speed and density.

 

“Fortunately, very much by luck, we had hit on a technology that had just the right degree of difficulty for a successful start-up,” Mr. Moore wrote in 1994. “This was how Intel began.”

 

In the early 1970s, Intel’s 4000 series “computer on a chip” began the revolution in personal computers, although Intel itself missed the opportunity to manufacture a PC, which Mr. Moore blamed partly on his own shortsightedness.

 

“Long before Apple, one of our engineers came to me with the suggestion that Intel ought to build a computer for the home,” he wrote. “And I asked him, ‘What the heck would anyone want a computer for in his home?’”

 


Mr. Moore holding a silicon wafer in 2005. Silicon was a key to Intel’s success. Credit...Paul Sakuma/Associated Press

Still, he saw the future. In 1963, while still at Fairchild as director of research and development, Mr. Moore contributed a book chapter describing what was to become the precursor to his eponymous law, without the explicit numerical prediction. Two years later, he published an article in Electronics, a widely circulated trade magazine, titled, “Cramming More Components Onto Integrated Circuits.”

 

“The article presented the same argument as the book chapter, with the addition of this explicitly numerical prediction,” said David Brock, a co-author of “Moore’s Law: The Life of Gordon Moore, Silicon Valley’s Quiet Revolutionary.”

 

There is little evidence that many people read the article when it was published, Mr. Brock said.

 

“He kept giving talks with these charts and plots, and people started using his slides and reproducing his graphs,” Mr. Brock said. “Then people saw the phenomenon happen. Silicon microchips got more complex, and their cost went down.”

 

In the 1960s, when Mr. Moore began in electronics, a single silicon transistor sold for $150. Later, $10 would buy more than 100 million transistors. Mr. Moore once wrote that if cars advanced as quickly as computers, “they would get 100,000 miles to the gallon and it would be cheaper to buy a Rolls-Royce than park it. (Cars would also be a half an inch long.)”
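(Those two price points make the scale easy to check; a back-of-envelope sketch using only the figures quoted above.)

```python
# Back-of-envelope check of the price drop quoted above:
# $150 per transistor in the 1960s vs. $10 for 100 million later.
early = 150.0                # dollars per transistor, 1960s
later = 10.0 / 100_000_000   # dollars per transistor, later
print(f"later price: ${later:.0e} per transistor")  # $1e-07
print(f"price drop:  {early / later:,.0f}x")        # 1,500,000,000x
```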

 

Mr. Moore’s survivors include his wife, Betty, his sons Kenneth and Steven, and four grandchildren.

 

In 2014, Forbes estimated Mr. Moore’s net worth at $7 billion. Yet he remained unprepossessing throughout his life, preferring tattered shirts and khakis to tailored suits. He shopped at Costco and kept a collection of fly lures and fishing reels on his office desk.

 

Moore’s Law is bound to reach its end, as engineers encounter some basic physical limits, as well as the extreme cost of building factories to achieve the next level of miniaturization. And in recent years, the pace of miniaturization has slowed.

 

Mr. Moore himself commented from time to time on the inevitable end to Moore’s Law. “It can’t continue forever,” he said in a 2005 interview with Techworld magazine. “The nature of exponentials is that you push them out and eventually disaster happens.”

 

Holcomb B. Noble, a former science editor for The Times, died in 2017.

Katie Hafner, a former staff reporter for The New York Times, is a co-author of "Where Wizards Stay Up Late: The Origins of The Internet."

 


(I'm sorry for not updating my photostream for a while. I almost spent a week thinking of an excuse, lol.)

 

Yeah, Spain won against Germany! :D WOOHOOO!

Have you heard of Paul the Octopus? I heard he's the octopus that can predict World Cup results! Last time, Paul predicted that Spain would win... and he was right. Sounds crazy, I know, but try to find it on YouTube!

Nostradamus was born in Saint-Rémy-de-Provence, southern France, on December 14, 1503. He was a famous French physician, cabalist, and pharmacist, best known for his book Les Prophéties, whose first edition was published in 1555. Nostradamus’ prophecies are expressed in verses called quatrains. Many of his predictions, such as the rise to power of Adolf Hitler and the Second World War, turned out to be accurate.

 

Nostradamus wrote 6,338 prophecies, many of them fulfilled. His prophecies cover a period reaching the year 3797. The secret of his predictions is not known. His quatrains continue to fascinate the world, although they were written almost five centuries ago.

 

Here are some of Nostradamus’ predictions for 2021:

1. Zombie Apocalypse. A Russian scientist will create a biological weapon and produce a virus that can turn humankind into zombies, and we will all be extinct in the near future. “Few young people: half-dead to give a start. Dead through spite, he will cause the others to shine, And in an exalted place some great evils to occur: Sad concepts will come to harm each one, Temporal dignified, the Mass to succeed. Fathers and mothers dead of infinite sorrows, Women in mourning, the pestilent she-monster: The Great One to be no more, all the world to end.”

2. A Famine of Biblical Proportions. Nostradamus predicted that the first signs of the end of the world would be famine, earthquakes, different illnesses, and epidemics, which are already happening more frequently. The coronavirus pandemic of 2020 represents the beginning of a series of unfavorable events that will affect the world’s population. The famine that lurks is one the world has never faced before. A catastrophe of huge proportions will throw us back in history, and a great part of the world’s population will not be able to overcome this curse. “After great trouble for humanity, a greater one is prepared, The Great Mover renews the ages: Rain, blood, milk, famine, steel, and plague, In the heavens fire seen, a long spark running.” Starting in 2020, after 248 years, Saturn in Capricorn united its forces with Pluto, also in Capricorn, in a remarkable conjunction that will change the fate of the world. Saturn in Capricorn is responsible for social hierarchies, state power, authority, functions, and status, and this is what the conjunction with Pluto, the planet of death, destruction, and reconstruction, triggered.

3. Solar Storms. 2021 will be quite a significant year in terms of major global events. Great solar storms will take place, which could cause major damage to the earth. Nostradamus supposedly warned: “We shall see the water rising and the earth falling under it.” The harmful effects of climate change will then lead to many wars and conflicts, as the world fights over resources, and mass migration will follow.

4. A Comet Will Hit the Earth, or Come Very Close to It. This event will cause earthquakes and other natural disasters, as concluded from the quatrain: “In the sky, one sees fire and a long trail of sparks.” Other interpretations of this quatrain assert that it refers to a great asteroid that will hit the Earth. Once it enters the Earth’s atmosphere, the asteroid will heat up, appearing in the sky like a great fire. NASA emits alerts daily, but this time it is something more serious: an asteroid called 2009 KF1 has a chance of hitting the Earth on May 6, 2021, a conclusion NASA reached after analyzing its trajectory. NASA claims this asteroid could hit the Earth with the equivalent of 230 kilotons of TNT, 15 times more than the nuclear bomb the Americans detonated over Hiroshima in 1945.

5. A Devastating Earthquake Will Destroy California. According to the interpretation of a quatrain written by Nostradamus, an extremely powerful seism will destroy California in 2021. Nostradamus predicts that a great earthquake will hit the New World (“the western lands”), and California is the logical place where it might happen. According to astrologers (“Mercury in Sagittarius, Saturn fading”), the next date when Mars and Saturn will be in this position in the sky is November 25, 2021. Nostradamus’ quatrain: “The sloping park, great calamity, Through the Lands of the West and Lombardy The fire in the ship, plague, and captivity; Mercury in Sagittarius, Saturn fading.”

6. American Soldiers Will Have Brain-Chip Implants. American soldiers will be turned into a kind of cyborg, at least at the brain level, to save the human race. The chip should offer the digital intelligence needed to progress beyond the limits of biological intelligence, which could mean incorporating artificial intelligence into our bodies and brains. “The newly made one will lead the army, Almost cut off up to near the bank: Help from the Milanais elite straining, The Duke deprived of his eyes in Milan in an iron cage.”

 

Conclusions: “A prophet is properly speaking one who sees distant things through a natural knowledge of all creatures. And it can happen that the prophet, bringing about the perfect light of prophecy, may make manifest things both human and divine.” (Nostradamus, in a letter to his son, Cesar) Nostradamus’ quatrains include many disturbing predictions. Based both on what we know of Nostradamus as a human being and on the dangerous era he lived in, these prophecies for 2021 reveal fragments of what the alchemist predicted for our world.

 

Feel free to publish a summary of this article (in English or translated into another language) along with a link to the full piece www.yearly-horoscope.org/nostradamus-2021-predictions/

 

Watching things like climate change, technological evolution, and inadequate governmental decisions, we might think that everything is against us and that humankind is on the verge of extinction. Here are just a few of Nostradamus’ predictions, outlining the idea of a terrifying future, far from what we would have imagined.

 

Artificial Intelligence – The Robots Will Rule the World. From 2021, artificial intelligence will equal or even surpass human intelligence, which could lead to an apocalyptic scenario like those we see in movies. “The Moon in the full of night over the high mountain, The new sage with a lone brain sees it: By his disciples invited to be immortal, Eyes to the south. Hands in bosoms, bodies in the fire.”

Mechanization. Most previsions indicate that by 2023 the labor market will crash. Automated machines will replace people in the work process, since they don’t demand higher salaries and don’t need breaks or other benefits. When employers choose robots instead of humans, the whole social model will crash. Unemployment, social disorder, and misery are just a few of the consequences of mechanization.

A War between Two Allied Countries. Two allied countries will get into a classic, open military conflict, with naval fleets fully engaged against each other. Unfortunately, this will only be the beginning, because the conflict between these two seemingly “friendly” countries will degenerate into a global war involving the most powerful countries in the world. “In the city of God there will be a great thunder, Two brothers torn apart by Chaos while the fortress endures, The great leader will succumb, The third big war will begin when the big city is burning.”

The Economic Collapse of 2021. Hundreds of closed hedge funds will go bankrupt, and the international exchange market will need to close for a short time – maybe even a week – to stop the panic selling of shares that will slowly envelop the stock markets. Nostradamus also predicted the 2008 crisis, and in 2021 things are not great: the exchange markets are in free fall and have reached panic levels; the end of the crisis is still out of sight; the United States has been facing economic stagnation for several years; and the economic fundamentals no longer apply.

Space Flight Will Be Accessible to Common People. 2021 will start a new era of space tourism. Common people will fly into space and be able to admire spectacular views of the Earth.

Sea Level Rise. There is no doubt: the climate is constantly changing. Of all the apocalyptic scenarios that might come true, sea level rise is the most dramatic. Presently, 50% of the world’s population lives in coastal areas. A preview of a possible disaster can be seen in the example of Newtok, Alaska, which in less than five years will be swallowed up by rising waters. The small community of 400 people will not be the only one affected. Current estimates suggest that Venice will be uninhabitable by 2100, and Los Angeles and Amsterdam will be abandoned five years later. “Peace and plenty for a long time the place will praise: Throughout his realm the fleur-de-lis deserted: Bodies dead by water, land one will bring there, Vainly awaiting the good fortune to be buried there.”

Solar Eruptions. Solar activity has been and still is a subject of interest for experts and laypeople alike. Because the sun will reach, in 2021, the peak of an 11-year cycle known as the solar maximum, various theoreticians have rushed to speculate. It is a fact that more intense solar explosions take place at a solar maximum than before it, but this does not translate into total chaos or natural catastrophe. Solar eruptions are nothing new; they occur at regular intervals, and the event taking place in 2021 would mostly mean interruptions to satellite communications. “Condom and Auch and around Mirande, I see fire from the sky which encompasses them. Sun and Mars conjoined in Leo, then at Marmande, lightning, great hail, a wall falls into the Garonne.”

How many predictions did Nostradamus get right? Part of the prophecies of Nostradamus, the famous French doctor and alchemist of the 16th century, have come true. Nostradamus predicted the beginning of the Second World War, Hitler’s ascension, the fall of communism, President J. F. Kennedy’s assassination, India’s independence and the appearance of the State of Israel on the world map – events confirmed by the passing of time – but also occurrences further off in time. Read also: Horoscope 2021 for every zodiac sign.

Things You Might Not Know About Nostradamus. Nostradamus is certainly one of the most illustrious personalities in history, a notoriety due to his famous prophecies and predictions. Beyond astrology, Nostradamus was a talented doctor, but also a controversial character who specialized in occultism. Everyone has heard of Nostradamus: a medieval figure renowned for his capacity to predict, through what he considered scientific methods, events that would happen in the distant future. His predictions, written around 500 years ago, still circulate around the world today, and the Frenchman is one of the most important figures of the occult arts. Besides astrology and his predictions regarding the future, Nostradamus had an adventurous love life, marked by long journeys, extrasensory experiences, flight from the Inquisition, and an exceptional, yet unjustly less-mentioned, medical career. He aroused admiration, but also envy. Nostradamus claimed that his predictions had a scientific foundation: he managed to predict the future by calculating the positions of the stars and planets relative to the earth and other astral bodies.

How Nostradamus died. Nostradamus even predicted his own death: “Next to the bench and bed, I will be found dead.” After announcing one evening that he would not survive the night, he died of a gout episode and was found dead the following morning, next to his worktable.

 

Feel free to publish a summary of this article (in English or translated into another language) along with a link to the full piece wisehoroscope.org/nostradamus-2021/

Prediction said a "dusting" of snow. We got a foot!!

Henry Vernon almost fainted when Madame Irma made such a terrible prediction.

Modified vignette to fit the rules of this contest: ( brickfanatics.co.uk/cmf-series-9-diorama-building-contest/ )

The prediction: This is going to be an interesting year (4 years?)! Must refrain from walking under ladders, too!

 

Also foreseen, a large order of this new Instax Monochrome film. Fun stuff!

 

From my ongoing series: Homebound pinholes.

 

Le Bambole Mk. X - "The Pin-sta-nair" Pinhole Camera.

Fujifilm Instax Mini Monochrome Film.

Two aspects of a March morning and a promise of rain by lunchtime. First rays to warm the soul.

Prediction is very difficult, especially if it's about the future. - Niels Bohr

 

I think this would make a great saying for a fortune cookie.

May good things come your way, and sooner than you think.

 

It's that time of the year again: the Oscars are this Sunday! I've seen a lot of the nominated films this year, so I feel pretty confident in my predictions. So here are my predictions for best director, best score, best animated film, best visual effects, best adapted screenplay, and best original screenplay.

 

Best Director: Guillermo del Toro (The Shape of Water)

 

Best Score: Dunkirk

 

Best Animated Film: Coco

 

Best Visual Effects: War for the Planet of the Apes

 

Best Screenplay:

Adapted: Molly's Game

Original: Three Billboards

 

Make sure to check out part 1 for some other predictions: www.flickr.com/photos/antdude3001/25713366787/in/datepost...

 

Also, if you want to check out my predictions for all the Oscar categories, check out my Letterboxd list:

letterboxd.com/antman3000/list/2018-oscars-predictions/

What are your predictions? Leave them down in the comments below!

Cinema-Phono-Telegraphic Correspondence

 

This gallery depicts a series of futuristic pictures by the French painter Jean-Marc Côté and other artists, issued in 1899, 1900, 1901 and 1910. Originally in the form of paper cards enclosed in cigarette and cigar boxes and, later, as postcards, the images described the world as it was imagined to be in the then-distant year 2000.

 

At least 87 were produced, and I have managed to capture 73 of them 😊. While a few were on point (a version of Skype or FaceTime), many were wildly off the mark (underwater croquet, anyone?). And all are definitely worth a look!

 

Sources: All images are in the public domain. Most were obtained from gallica.bnf.fr/, although I had to edit a few to render them in higher resolution.

prediction: rain, rain, and more rain

 

and they were right!

It's that time of the year again: the Oscars are this Sunday! I've seen a lot of the nominated films this year, so I feel pretty confident in my predictions. So here are my predictions for best film, best lead actor, best lead actress, best supporting actor, and best supporting actress!

 

Best Film: The Shape of Water

 

Best Lead Actor: Gary Oldman (Darkest Hour)

 

Best Lead Actress: Frances McDormand (Three Billboards)

 

Best Supporting Actor: Sam Rockwell (Three Billboards)

 

Best Supporting Actress: Allison Janney (I, Tonya)

 

Make sure to check out part 2 for some other predictions: www.flickr.com/photos/antdude3001/39689489585/in/datepost...

 

Also, if you want to check out my predictions for all the Oscar categories, check out my Letterboxd list:

letterboxd.com/antman3000/list/2018-oscars-predictions/

What are your predictions? Leave them down in the comments below!

predictions for tomorrow in southern Wisconsin: 2 inches? 4 inches? 8 inches?

 

Might as well be 2 feet!!!!

prediction of summer sunsets

The Hunt for Microbes

 

This gallery depicts a series of futuristic pictures by the French painter Jean-Marc Côté and other artists, issued in 1899, 1900, 1901 and 1910. Originally in the form of paper cards enclosed in cigarette and cigar boxes and, later, as postcards, the images described the world as it was imagined to be in the then-distant year 2000.

 

At least 87 were produced, and I have managed to capture 73 of them 😊. While a few were on point (a version of Skype or FaceTime), many were wildly off the mark (underwater croquet, anyone?). And all are definitely worth a look!

 

Sources: All images are in the public domain. Most were obtained from gallica.bnf.fr/, although I had to edit a few to render them in higher resolution.

OM-2n | 55/1.2

 

© copyrighted

Aspen, Colorado

September 5, 1980

 

2020 - Visions of the Future - Roaring Fork Valley (Page 1 of 7)

 

Nick's 1980 predictions for the year 2020

 

Document courtesy of:

Aspen Historical Society, IDCA Time Capsule Collection

 

Text:

 

1

 

2020 - VISIONS OF THE FUTURE - ROARING FORK VALLEY

By Nick DeWolf, Friday, September 5 1980

 

I hope we’ll take seriously the concept of burying the results of this conference in a time capsule because forty years from now most of [us] will still be alive so we will enjoy seeing what fools we were. I originally intended to give you a jazzy slide show of pictures from OMNI Magazine of wonderful machines and inventions from the future, but such fun excursions into fantasy will prevent all of us from truly thinking about tomorrow. We picked 2020 because that’s perfect vision, knowing full well that we don’t even have a chance to be close. The only forecast we can make safely is that we will be wrong, but more importantly, looking at other seasoned forecasts, we will almost certainly completely miss the most important issues forty years from now. I’ve been in the fast-changing semiconductor business, and thirty years ago we made all kinds of forecasts; even the most optimistic of us, making our most bizarre and kookiest guesses, were fifteen times too low. The explosive growth just plain blew up in our faces.

 

In 2020 most of us will be a mere 85. By then, however, many of us will still be in our prime, with 20 years to go. But 2020 is relatively a twink away.

The earth is 4,600 million years old.

Cellular life has been here most of that time.

Photosynthesis for 2,000 million years.

Quasi-man 2 million years.

Erectus a tenth of a million years.

The last glacier coincided roughly with the birth of consciousness (some believe), agriculture, the Church, cities, factories - all five to ten thousand years old.

Books, schools, divorces, Hell and democracy were invented between two and five thousand years ago.

We’ve been capable of self destruction for only about thirty years.

Half of our published literature is only six years old.

 

Futurists of the past have held to cyclical views, a convolution thing, such as sunspot cycles (my Father’s favorite way of predicting the stock market), but the kind of rollercoaster we’re on now makes the cyclical view seem kind of silly.

Others are evolutionists, who think about trends, and extrapolate Pitkin County growth forty years from now via percentage growth rates.

But I believe in catastrophe theory - that the future will come by lurches and leaps and creaks up and down, and changes will be more revolutionary.

Above all, more than at any other time in the history of man, we control our own destiny. The incredible number of options we have are really within our control instead of nature’s.

Many of those who want to plot charts are stuck in measuring the quality of life with measurements like:

Air pollution; Gross National Product; Nuclear Radiation levels; Bacteria counts.

I find that those measures of the quality of life don’t interest me much - what counts to me are things like:

Rewards; Happiness; Freedom; Spark; Elan; Spirit; Privacy; Self Expression; Fulfillment - those kinds of things.

Therefore most are incapable of attacking the subject scientifically - three cheers!

 

part of an archival project, featuring the work of nick dewolf

 

© the Nick DeWolf Foundation

Requests for use are welcome via flickrmail or nickdewolfphotoarchive [at] gmail [dot] com

 

predictions for spiderman3?...........never!

Predictions of 200-1,000 per hour did not pan out. Oh well, beautiful night anyway. Bonus points! Reddish glow of the Northern Lights!

Having just finished the book, I find myself “just thinking about the weather” (like 10,000 Maniacs).

 

Every decade, we have added one day to our forward weather forecast. So, today’s weekly forecast is as accurate as the 2-day forecast in the 1970s. In the first book on weather prediction — 100 years ago — Lewis Fry Richardson prophesied “perhaps someday in the dim future it will be possible to advance the computation faster than the weather advances, and at a cost less than the savings to mankind due to the information gained. But that is a dream.”

 

What a setup for Moore’s Law! More on that later.

 

Weather prediction started with telegraph networks in the 1860s. News of a storm front could arrive by electrical signal faster than the wind itself. Those physical networks were interrupted by the Civil War and the two world wars. The observation stations drove short-term forecasts based on simple pattern matching: meteorologists flipped through maps of prior patterns to find one that looked similar, missing the nuances in the complex networks of interactions.

 

What was needed was a theory, a mathematics derived from first principles of the physics of atmospheric flows. Those equations, a collection of interlocking partial differential equations across a matrix of pressure, temperature, air density, wind vectors and such, were first published in 1904 (and are the subject of the thick textbook below). They are practically unsolvable, but they can be approximated with a variety of numerical and graphical methods and simplifying assumptions (hydrostatic, anelastic, autobarotropic shallow fluid, etc.). New weather prediction models were then back-tested on historical data, an iterative feedback cycle of learning from past to present.
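The full primitive equations are far beyond a snippet, but the flavor of "approximate a flow PDE with a numerical method" can be shown on the simplest flow equation there is, 1-D advection (du/dt + c du/dx = 0), stepped with an upwind finite difference. This is my own illustration of the method's flavor, not anything from the book:

```python
import numpy as np

# Flavor of numerical weather prediction on the simplest flow equation:
# 1-D advection, du/dt + c * du/dx = 0, stepped with an upwind
# finite-difference scheme. An illustration only, not the primitive
# equations used by real forecast models.

nx, c, dx, dt = 200, 1.0, 1.0, 0.5       # grid points, wind speed, steps
assert c * dt / dx <= 1.0                # CFL condition for stability

x = np.arange(nx) * dx
u = np.exp(-((x - 50.0) / 10.0) ** 2)    # initial blob of "weather"

for _ in range(100):                     # advance 100 time steps
    u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])  # upwind difference

print(f"blob peak has drifted to x = {x[np.argmax(u)]:.0f}")  # ~100
```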

 

The weather became important to ship traffic and battle planning, and forecasts were weaponized in wartime. The terminology of weather “fronts” traces to the martial vernacular of WW I. The Germans were at a distinct meteorological disadvantage, with storms coming from areas controlled by the Allied powers. Siemens developed automatic weather stations with NiCad batteries and radios that could be dropped off by plane in remote locations. With 200 submarines trying to maintain a blockade of England, the Germans desperately needed weather predictions for the North Atlantic. In 1943, they sent U-537 to an uninhabited part of North America, and set up a weather station on a local peak, with a long range 30-ft. diameter antenna to beam weather data back to Germany. To evade detection, they hand-painted “Canada Meteor Service” on the side and scattered American cigarette packs about. It remained there until discovered in 1981. Yes, the only known incursion by the Nazis onto North American soil was for the weather.

 

Then came the rockets. The first U.S. launch of a V-2 rocket brought back from Germany snapped a picture of the cloud cover as had never been seen before, with a quarter of the U.S. in a single frame. In 1954, an Aerobee rocket cam captured the first clear image of a tropical storm swirling in the Gulf of Mexico, and it became a full-page spread in Life magazine. (I have an Aerobee nose cone, fin can, and engine on display at work).

 

The first weather satellite, TIROS 1, launched in 1960, and in Kennedy’s famous speech that launched the Apollo program, he also called for “at the earliest possible time, a satellite system for worldwide weather observation.” It was overshadowed a bit by the whole man on the moon thing.

 

Today, the polar-orbiting LEO satellites raster scan the Earth (like Planet Labs) and “contribute the most quantitative data to the weather models. When it comes to meaningful impacts on forecasting, they are the champs.” (p.81). We have hundreds of LEO and GEO birds with a variety of weather instruments (optical, IR, radar) providing global coverage.

 

It’s a torrent of data, feeding supercomputers that are upgraded every two years. About half of the supercomputers on Earth are working on the weather. The European Centre for Medium-Range Weather Forecasts has two supercomputers the size of volleyball courts with 260,000 processor cores (in 2019). They maintain the current champion model for forecasting. They devote 50% of their compute cycles to iterating on model improvements (and the other 50% to running the latest model for the world). They have improved their forecasts continuously for 40 years straight.

 

To build a global model, there are global sensors from many nations, all contributing to a public good. “WMO estimates put the economic value of weather services in excess of $100 billion annually, while the cost of providing them is a tenth of that.” (p.175) Still, a big number for a public good. “The weather machine is a last bastion of international cooperation.” (p.181)

 

P.S. The book is not nearly as gripping as the history of ammonia, and it ends abruptly without painting a picture of what’s next for Sim-Earth... with a proliferation of networked sensors and machine learning in the mix.

Tucker's predictions for Super Bowl 57 between

the Philadelphia Eagles and Kansas City Chiefs!

 

Tucker, Welsh Corgi brings down the stuffed bulldog

before he picks it up to run for a touchdown.

 

Tucker's prediction is for the underdogs, the Kansas City Chiefs,

to win!

  

Tucker picked the 2023 Super Bowl 57 Champions!

Yes, The Kansas City Chiefs win in a close, exciting game

down to the final seconds with a field goal.

Chiefs win 38 to 35!

Rue de la Loi, Brussels. Was reminded of this picture when I saw the video here:

www.bbc.co.uk/news/av/world-europe-68402627

It's that time of the year again: the Oscars are tomorrow! I did this back in 2015 and forgot to do it again last year, but I didn't forget this time! And like last time, I'm not going to make predictions for all the awards, only the big ones. So here's the biggie of the two: best film, best lead actor, best lead actress, best supporting actor, and best supporting actress!

 

Best Film: La La Land

 

Best Lead Actor: Casey Affleck (Manchester By the Sea)

 

Best Lead Actress: Emma Stone (La La Land)

 

Best Supporting Actor: Mahershala Ali (Moonlight)

 

Best Supporting Actress: Viola Davis (Fences)

 

Make sure to check out part 2 for some other predictions: www.flickr.com/photos/antdude3001/32961603142/in/datepost...

What are your predictions? Leave them down in the comments below!

 

you will wake up with a terrible headache.

It's that time of the year again: the Oscars are tomorrow! I did this back in 2015 and forgot to do it again last year, but I didn't forget this time! And like last time, I'm not going to make predictions for all the awards, only the big ones. So here are my predictions for best director, best score, best animated film, best visual effects, best adapted screenplay, and best original screenplay.

 

Best Director: Denis Villeneuve (Arrival)

 

Best Score: La La Land

 

Best Animated Film: Moana (though I wouldn't be surprised if it's Zootopia, this one was kinda tough to call)

 

Best Visual Effects: The Jungle Book (But I'm holding out for Doctor Strange!)

 

Best Screenplay:

Adapted: Arrival

Original: Manchester by the Sea

 

Make sure to check out part 1 for some other predictions: www.flickr.com/photos/antdude3001/33076494876/in/datepost...

What are your predictions? Leave them down in the comments below!

Pentax K-3

DA 15mm 4.0 Limited

Out of camera jpeg with digital filters

His scientific works include a collaboration with Roger Penrose on gravitational singularity theorems in the framework of general relativity and the theoretical prediction that black holes emit radiation, often called Hawking radiation. Hawking was the first to set out a theory of cosmology explained by a union of the general theory of relativity and quantum mechanics. He is a vigorous supporter of the many-worlds interpretation of quantum mechanics.

 

Hawking is an Honorary Fellow of the Royal Society of Arts, a lifetime member of the Pontifical Academy of Sciences, and a recipient of the Presidential Medal of Freedom, the highest civilian award in the US. In 2002, Hawking was ranked number 25 in the BBC's poll of the 100 Greatest Britons. He was the Lucasian Professor of Mathematics at the University of Cambridge between 1979 and 2009 and has achieved commercial success with works of popular science in which he discusses his own theories and cosmology in general; his book A Brief History of Time appeared on the British Sunday Times best-seller list for a record-breaking 237 weeks.

 

Hawking has a rare early-onset, slow-progressing form of amyotrophic lateral sclerosis (ALS) that has gradually paralysed him over the decades. He now communicates using a single cheek muscle attached to a speech-generating device.

  

PRIMARY and SECONDARY SCHOOL YEARS

 

Hawking began his schooling at the Byron House School in Highgate, London. He later blamed its "progressive methods" for his failure to learn to read while at the school. In St Albans, the eight-year-old Hawking attended St Albans High School for Girls for a few months. At that time, younger boys could attend one of the houses.

Hawking attended Radlett School, an independent school in the village of Radlett in Hertfordshire, for a year, and from September 1952, St Albans School, an independent school in the city of St Albans in Hertfordshire. The family placed a high value on education. Hawking's father wanted his son to attend the well-regarded Westminster School, but the 13-year-old Hawking was ill on the day of the scholarship examination. His family could not afford the school fees without the financial aid of a scholarship, so Hawking remained at St Albans. A positive consequence was that Hawking remained with a close group of friends with whom he enjoyed board games, the manufacture of fireworks, model aeroplanes and boats, and long discussions about Christianity and extrasensory perception. From 1958 on, with the help of the mathematics teacher Dikran Tahta, they built a computer from clock parts, an old telephone switchboard and other recycled components.

Although known at school as "Einstein", Hawking was not initially successful academically. With time, he began to show considerable aptitude for scientific subjects and, inspired by Tahta, decided to read mathematics at university. Hawking's father advised him to study medicine, concerned that there were few jobs for mathematics graduates. He also wanted his son to attend University College, Oxford, his own alma mater. As it was not possible to read mathematics there at the time, Hawking decided to study physics and chemistry. Despite his headmaster's advice to wait until the next year, Hawking was awarded a scholarship after taking the examinations in March 1959.

  

UNDERGRADUATE YEARS

 

Hawking began his university education at University College, Oxford in October 1959 at the age of 17. For the first 18 months, he was bored and lonely – he was younger than many of the other students, and found the academic work "ridiculously easy". His physics tutor, Robert Berman, later said, "It was only necessary for him to know that something could be done, and he could do it without looking to see how other people did it." A change occurred during his second and third year when, according to Berman, Hawking made more of an effort "to be one of the boys". He developed into a popular, lively and witty college member, interested in classical music and science fiction. Part of the transformation resulted from his decision to join the college boat club, the University College Boat Club, where he coxed a rowing team. The rowing trainer at the time noted that Hawking cultivated a daredevil image, steering his crew on risky courses that led to damaged boats.

Hawking has estimated that he studied about a thousand hours during his three years at Oxford. These unimpressive study habits made sitting his finals a challenge, and he decided to answer only theoretical physics questions rather than those requiring factual knowledge. A first-class honours degree was a condition of acceptance for his planned graduate study in cosmology at the University of Cambridge. Anxious, he slept poorly the night before the examinations, and the final result was on the borderline between first- and second-class honours, making a viva (oral examination) necessary. Hawking was concerned that he was viewed as a lazy and difficult student. So, when asked at the oral to describe his future plans, he said, "If you award me a First, I will go to Cambridge. If I receive a Second, I shall stay in Oxford, so I expect you will give me a First." He was held in higher regard than he believed; as Berman commented, the examiners "were intelligent enough to realise they were talking to someone far cleverer than most of themselves". After receiving a first-class BA (Hons.) degree in natural science and completing a trip to Iran with a friend, he began his graduate work at Trinity Hall, Cambridge, in October 1962.

  

GRADUATE YEARS

 

Hawking's first year as a doctoral student was difficult. He was initially disappointed to find that he had been assigned Dennis William Sciama, one of the founders of modern cosmology, as a supervisor rather than noted astronomer Fred Hoyle, and he found his training in mathematics inadequate for work in general relativity and cosmology. After being diagnosed with motor neurone disease, Hawking fell into a depression – though his doctors advised that he continue with his studies, he felt there was little point. However, his disease progressed more slowly than doctors had predicted. Although Hawking had difficulty walking unsupported, and his speech was almost unintelligible, an initial diagnosis that he had only two years to live proved unfounded. With Sciama's encouragement, he returned to his work. Hawking started developing a reputation for brilliance and brashness when he publicly challenged the work of Fred Hoyle and his student Jayant Narlikar at a lecture in June 1964.

When Hawking began his graduate studies, there was much debate in the physics community about the prevailing theories of the creation of the universe: the Big Bang and Steady State theories. Inspired by Roger Penrose's theorem of a spacetime singularity in the centre of black holes, Hawking applied the same thinking to the entire universe; and, during 1965, he wrote his thesis on this topic. There were other positive developments: Hawking received a research fellowship at Gonville and Caius College; he obtained his PhD degree in applied mathematics and theoretical physics, specialising in general relativity and cosmology, in March 1966; and his essay entitled "Singularities and the Geometry of Space-Time" shared top honours with one by Penrose to win that year's prestigious Adams Prize.

  

CAREER

 

1966–1975

In his work, and in collaboration with Penrose, Hawking extended the singularity theorem concepts first explored in his doctoral thesis. This included not only the existence of singularities but also the theory that the universe might have started as a singularity. Their joint essay was the runner-up in the 1968 Gravity Research Foundation competition. In 1970 they published a proof that if the universe obeys the general theory of relativity and fits any of the models of physical cosmology developed by Alexander Friedmann, then it must have begun as a singularity. In 1969, Hawking accepted a specially created Fellowship for Distinction in Science to remain at Caius.

In 1970, Hawking postulated what became known as the second law of black hole dynamics, that the event horizon of a black hole can never get smaller.[83] With James M. Bardeen and Brandon Carter, he proposed the four laws of black hole mechanics, drawing an analogy with thermodynamics. To Hawking's irritation, Jacob Bekenstein, a graduate student of John Wheeler, went further—and ultimately correctly—to apply thermodynamic concepts literally.[85][86] In the early 1970s, Hawking's work with Carter, Werner Israel and David C. Robinson strongly supported Wheeler's no-hair theorem: whatever the original material from which a black hole is formed, it can be completely described by the properties of mass, electrical charge and rotation.[87][88] His essay titled "Black Holes" won the Gravity Research Foundation Award in January 1971.[89] Hawking's first book, The Large Scale Structure of Space-Time, written with George Ellis, was published in 1973.
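
For readers who want the formulas behind this analogy, the area theorem (the second law above) and the first law of black hole mechanics are usually written as follows; this is a sketch of the standard textbook expressions, in units with G = c = 1, not a quotation from Hawking's papers:

\[
\mathrm{d}A \ge 0, \qquad \mathrm{d}M = \frac{\kappa}{8\pi}\,\mathrm{d}A + \Omega_H\,\mathrm{d}J + \Phi_H\,\mathrm{d}Q
\]

Here A is the horizon area, M the mass, \kappa the surface gravity, \Omega_H and J the horizon's angular velocity and angular momentum, and \Phi_H and Q its electric potential and charge. The analogy pairs \kappa with temperature and A with entropy, which is precisely the pairing Bekenstein took literally.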

Beginning in 1973, Hawking moved into the study of quantum gravity and quantum mechanics. His work in this area was spurred by a visit to Moscow and discussions with Yakov Borisovich Zel'dovich and Alexei Starobinsky, whose work showed that according to the uncertainty principle, rotating black holes emit particles. To Hawking's annoyance, his much-checked calculations produced findings that contradicted his second law, which claimed black holes could never get smaller, and supported Bekenstein's reasoning about their entropy. His results, which Hawking presented from 1974, showed that black holes emit radiation, known today as Hawking radiation, which may continue until they exhaust their energy and evaporate. Initially, Hawking radiation was controversial. However, by the late 1970s and following the publication of further research, the discovery was widely accepted as a significant breakthrough in theoretical physics. Hawking was elected a Fellow of the Royal Society (FRS) in 1974, a few weeks after the announcement of Hawking radiation. At the time, he was one of the youngest scientists to become a Fellow.
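
The results of this period are often summarised in two formulas, given here as a sketch of the standard textbook forms rather than as quotations from the text: the temperature of the radiation emitted by a black hole of mass M, and the Bekenstein–Hawking entropy associated with its horizon area A:

\[
T_\mathrm{H} = \frac{\hbar c^{3}}{8\pi G M k_\mathrm{B}}, \qquad S_\mathrm{BH} = \frac{k_\mathrm{B} c^{3} A}{4 G \hbar}
\]

Because T_H grows as M shrinks, an evaporating black hole radiates ever faster, which is why the process can run to completion as described above.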

Hawking was appointed to the Sherman Fairchild Distinguished Visiting Professorship at the California Institute of Technology (Caltech) in 1974. He worked with a friend on the faculty, Kip Thorne, and engaged him in a scientific wager about whether the dark star Cygnus X-1 was a black hole. The wager was an "insurance policy" against the proposition that black holes did not exist. Hawking acknowledged that he had lost the bet in 1990; it was the first of several bets that he was to make with Thorne and others. Hawking has maintained ties to Caltech, spending a month there almost every year since this first visit.

 

1975–1990

Hawking returned to Cambridge in 1975 to a more academically senior post, as reader in gravitational physics. The mid to late 1970s were a period of growing public interest in black holes and in the physicists who were studying them. Hawking was regularly interviewed for print and television. He also received increasing academic recognition for his work. In 1975, he was awarded both the Eddington Medal and the Pius XI Gold Medal, and in 1976 the Dannie Heineman Prize, the Maxwell Prize and the Hughes Medal. He was appointed a professor with a chair in gravitational physics in 1977. The following year he received the Albert Einstein Medal and an honorary doctorate from the University of Oxford.

In 1979, Hawking was elected Lucasian Professor of Mathematics at the University of Cambridge. His inaugural lecture in this post was titled "Is the End in Sight for Theoretical Physics?" and proposed N=8 supergravity as the leading theory to solve many of the outstanding problems physicists were studying. His promotion coincided with a health crisis which led to his accepting, albeit reluctantly, some nursing services at home. At the same time, he was also making a transition in his approach to physics, becoming more intuitive and speculative rather than insisting on mathematical proofs. "I would rather be right than rigorous", he told Kip Thorne. In 1981, he proposed that information in a black hole is irretrievably lost when a black hole evaporates. This information paradox violates the fundamental tenet of quantum mechanics, and led to years of debate, including "the Black Hole War" with Leonard Susskind and Gerard 't Hooft.

Cosmological inflation – a theory proposing that following the Big Bang, the universe initially expanded incredibly rapidly before settling down to a slower expansion – was proposed by Alan Guth and also developed by Andrei Linde. Following a conference in Moscow in October 1981, Hawking and Gary Gibbons organized a three-week Nuffield Workshop in the summer of 1982 on "The Very Early Universe" at Cambridge University, which focused mainly on inflation theory. Hawking also began a new line of quantum theory research into the origin of the universe. In 1981 at a Vatican conference, he presented work suggesting that there might be no boundary – or beginning or ending – to the universe. He subsequently developed the research in collaboration with Jim Hartle, and in 1983 they published a model, known as the Hartle–Hawking state. It proposed that prior to the Planck epoch, the universe had no boundary in space-time; before the Big Bang, time did not exist and the concept of the beginning of the universe is meaningless. The initial singularity of the classical Big Bang models was replaced with a region akin to the North Pole. One cannot travel north of the North Pole, but there is no boundary there – it is simply the point where all north-running lines meet and end. Initially, the no-boundary proposal predicted a closed universe, which had implications about the existence of God. As Hawking explained, "If the universe has no boundaries but is self-contained... then God would not have had any freedom to choose how the universe began."
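
Schematically, the Hartle–Hawking state is usually presented as a Euclidean path integral over compact four-geometries, a standard textbook rendering offered here only as an illustration:

\[
\Psi[h_{ij}, \phi] \propto \int_{\text{compact geometries}} \mathcal{D}g\,\mathcal{D}\phi\; e^{-I_E[g,\phi]/\hbar}
\]

where h_{ij} is the spatial three-metric, \phi stands for the matter fields, and I_E is the Euclidean action. The "no boundary" condition is the restriction of the integral to compact geometries, the formal counterpart of the North Pole picture above.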

Hawking did not rule out the existence of a Creator, asking in A Brief History of Time "Is the unified theory so compelling that it brings about its own existence?" In his early work, Hawking spoke of God in a metaphorical sense. In A Brief History of Time he wrote: "If we discover a complete theory, it would be the ultimate triumph of human reason – for then we should know the mind of God." In the same book he suggested that the existence of God was not necessary to explain the origin of the universe. Later discussions with Neil Turok led to the realisation that the existence of God was also compatible with an open universe.

Further work by Hawking in the area of arrows of time led to the 1985 publication of a paper theorising that if the no-boundary proposition were correct, then when the universe stopped expanding and eventually collapsed, time would run backwards. A paper by Don Page and independent calculations by Raymond Laflamme led Hawking to withdraw this concept. Honours continued to be awarded: in 1981 he was awarded the American Franklin Medal, and in 1982 made a Commander of the Order of the British Empire (CBE). Awards do not pay the bills, however, and motivated by the need to finance the children's education and home expenses, in 1982 Hawking determined to write a popular book about the universe that would be accessible to the general public. Instead of publishing with an academic press, he signed a contract with Bantam Books, a mass market publisher, and received a large advance for his book. A first draft of the book, called A Brief History of Time, was completed in 1984.

One of the first messages Hawking produced with his speech-generating device was a request for his assistant to help him finish writing A Brief History of Time. Peter Guzzardi, his editor at Bantam, pushed him to explain his ideas clearly in non-technical language, a process that required many revisions from an increasingly irritated Hawking. The book was published in April 1988 in the US and in June in the UK, and it proved to be an extraordinary success, rising quickly to the top of bestseller lists in both countries and remaining there for months. The book was translated into many languages, and ultimately sold an estimated 9 million copies. Media attention was intense, and a Newsweek magazine cover and a television special both described him as "Master of the Universe". Success led to significant financial rewards, but also the challenges of celebrity status. Hawking travelled extensively to promote his work, and enjoyed partying and dancing into the small hours. He had difficulty refusing the invitations and visitors, which left limited time for work and his students. Some colleagues were resentful of the attention Hawking received, feeling it was due to his disability. He received further academic recognition, including five more honorary degrees,[149] the Gold Medal of the Royal Astronomical Society (1985), the Paul Dirac Medal (1987) and, jointly with Penrose, the prestigious Wolf Prize (1988). In 1989, he was appointed Member of the Order of the Companions of Honour (CH). He reportedly declined a knighthood.

  

1990–2000

Hawking pursued his work in physics: in 1993 he co-edited a book on Euclidean quantum gravity with Gary Gibbons and published a collected edition of his own articles on black holes and the Big Bang. In 1994, at Cambridge's Newton Institute, Hawking and Penrose delivered a series of six lectures that were published in 1996 as The Nature of Space and Time. In 1997, he conceded a 1991 public scientific wager made with Kip Thorne and John Preskill of Caltech. Hawking had bet that Penrose's proposal of a "cosmic censorship conjecture" – that there could be no "naked singularities", that is, singularities left unclothed by a horizon – was correct. After discovering his concession might have been premature, a new, more refined wager was made. This one specified that such singularities would occur without extra conditions. The same year, Thorne, Hawking and Preskill made another bet, this time concerning the black hole information paradox. Thorne and Hawking argued that since general relativity made it impossible for black holes to radiate and lose information, the mass-energy and information carried by Hawking radiation must be "new", and not from inside the black hole event horizon. Since this contradicted the quantum mechanics of microcausality, quantum mechanics theory would need to be rewritten. Preskill argued the opposite, that since quantum mechanics suggests that the information emitted by a black hole relates to information that fell in at an earlier time, the concept of black holes given by general relativity must be modified in some way.

Hawking also maintained his public profile, including bringing science to a wider audience. A film version of A Brief History of Time, directed by Errol Morris and produced by Steven Spielberg, premiered in 1992. Hawking had wanted the film to be scientific rather than biographical, but he was persuaded otherwise. The film, while a critical success, was not widely released. A popular-level collection of essays, interviews, and talks titled Black Holes and Baby Universes and Other Essays was published in 1993, and a six-part television series, Stephen Hawking's Universe, with a companion book, appeared in 1997. As Hawking insisted, this time the focus was entirely on science.

  

2000–present

 

Hawking continued his writings for a popular audience, publishing The Universe in a Nutshell in 2001; A Briefer History of Time, written in 2005 with Leonard Mlodinow to update his earlier works and make them accessible to a wider audience; and God Created the Integers, which appeared in 2006. Along with Thomas Hertog at CERN and Jim Hartle, from 2006 on Hawking developed a theory of "top-down cosmology", which holds that the universe had not one unique initial state but many different ones, and that it is therefore inappropriate to formulate a theory that predicts the universe's current configuration from one particular initial state. Top-down cosmology posits that the present "selects" the past from a superposition of many possible histories. In doing so, the theory suggests a possible resolution of the fine-tuning question.

Hawking continued to travel widely, including trips to Chile, Easter Island, South Africa, Spain (to receive the Fonseca Prize in 2008), Canada, and numerous trips to the United States. For practical reasons related to his disability, Hawking increasingly travelled by private jet, and by 2011 that had become his only mode of international travel. By 2003, consensus among physicists was growing that Hawking was wrong about the loss of information in a black hole. In a 2004 lecture in Dublin, he conceded his 1997 bet with Preskill, but described his own, somewhat controversial solution to the information paradox problem, involving the possibility that black holes have more than one topology. In the 2005 paper he published on the subject, he argued that the information paradox was explained by examining all the alternative histories of universes, with the information loss in those with black holes being cancelled out by those without such loss. In January 2014 he called the alleged loss of information in black holes his "biggest blunder".

As part of another longstanding scientific dispute, Hawking had emphatically argued, and bet, that the Higgs boson would never be found.[182] The particle was proposed to exist as part of the Higgs field theory by Peter Higgs in 1964. Hawking and Higgs engaged in a heated and public debate over the matter in 2002 and again in 2008, with Higgs criticising Hawking's work and complaining that Hawking's "celebrity status gives him instant credibility that others do not have." The particle was discovered in July 2012 at CERN following construction of the Large Hadron Collider. Hawking quickly conceded that he had lost his bet and said that Higgs should win the Nobel Prize for Physics, which he did in 2013.

 

In 2007, Hawking and his daughter Lucy published George's Secret Key to the Universe, a children's book designed to explain theoretical physics in an accessible fashion and featuring characters similar to those in the Hawking family.[188] The book was followed by sequels in 2009, 2011 and 2014.

In 2002, following a UK-wide vote, the BBC included Hawking in their list of the 100 Greatest Britons.[190] He was awarded the Copley Medal from the Royal Society (2006), the Presidential Medal of Freedom, which is America's highest civilian honour (2009), and the Russian Special Fundamental Physics Prize (2013).

Several buildings have been named after him, including the Stephen W. Hawking Science Museum in San Salvador, El Salvador, the Stephen Hawking Building in Cambridge, and the Stephen Hawking Centre at the Perimeter Institute in Canada. Appropriately, given Hawking's association with time, he unveiled the mechanical "Chronophage" (time-eating) Corpus Clock at Corpus Christi College, Cambridge, in September 2008.

During his career, Hawking has supervised 39 successful PhD students. As required by Cambridge University regulations, Hawking retired as Lucasian Professor of Mathematics in 2009. Despite suggestions that he might leave the United Kingdom as a protest against public funding cuts to basic scientific research, Hawking has continued to work as director of research at the Cambridge University Department of Applied Mathematics and Theoretical Physics, and indicated in 2012 that he had no plans to retire.

On 28 June 2009, as a tongue-in-cheek test of his 1992 conjecture that travel into the past is effectively impossible, Hawking held a party open to all, complete with hors d'oeuvres and iced champagne, but only publicised the party after it was over, so that only time-travellers would know to attend; as expected, nobody showed up.

On 20 July 2015, Hawking helped launch Breakthrough Initiatives, an effort to search for extraterrestrial life. In 2015, Richard Branson offered Stephen Hawking a seat on the Virgin Galactic spaceship for free. While no hard date has been set for launch, Virgin Galactic's SpaceShipTwo is slated to launch at the end of 2017. At 75, Hawking will not be the oldest person ever to go to space (John Glenn returned to space at age 77), but he will be the first person with amyotrophic lateral sclerosis (ALS) to go to space. While this will be Hawking's first time in space, it will not be his first experience of weightlessness: in 2007, he flew into zero gravity aboard a specially modified Boeing 727-200 aircraft. Hawking created Stephen Hawking: Expedition New Earth, a documentary on space colonisation, broadcast as a summer 2017 episode of Tomorrow's World.

In August 2015, Hawking said that not all information is lost when something enters a black hole, and that, according to his theory, it might be possible to retrieve information from a black hole.

LONDON, ENGLAND - MARCH 23: Emma Tucker on stage for the Predictions For The Future Of Media, during Advertising Week Europe, Piccadilly, on March 23, 2015 in London, England. (Photo by Stuart Wilson/Getty Images for Advertising Week)

Weather predictions said it would be a clear night, so I just could not miss the chance to go out and shoot the sky. Had to go out and capture some data for the coming weeks, as soon the moon will be out and you never know if the weather will be kind enough in future.

 

Went out to Warkworth satellite station last night, which is a pretty decent dark spot, though you do get some light pollution from Auckland in the south and Warkworth in the north.

 

In this image the light pollution can be seen on both sides, Auckland at left and Warkworth at right. You can see the Large and Small Magellanic Clouds to the left of the image, and near the centre, if you look at the horizon, you can see some green airglow.

 

So the best area to shoot was west and overhead. Had a go at some long-run data capture of Scorpius, but the conditions were not that great; there was moisture in the air and every now and then the lens fogged up, so we were getting on average three shots before we had to defog the lens.

 

After all the data capture, as the Milky Way was heading down, the clouds rolled in and we decided to pack up. But before leaving I had to go for a panorama; I thought the clouds would ruin it and cause banding issues, but it turned out great.

 

Napoleon came out of his house this morning sporting the Valentine's Day haircut, and sang his 'spring song' for the first time in 2014. He doesn't receive the weather forecast, but the area is predicted to reach 40 degrees tomorrow, the first time above freezing in many months. He hasn't been wrong with his signal in years. Once he was assured his picture had been taken at the proper angle (he looks taller this way), Napoleon grabbed lunch and flew inside his tree for a nap.
