New 3D map shows large scale structures in the universe 9 billion years ago

The FastSound project’s 3D map of the large-scale structure of a region in the Universe about 4.7 billion years after the Big Bang. This area covers 2.5 × 3 degrees of the sky, with a radial distance spanning 12-14.5 billion light years in comoving distance or 8-9.6 billion light years in light travel distance. Credit: NAOJ, SDSS, CFHT.

I remember seeing the Hubble 3-D IMAX movie in 2010 and literally gasping when the view pulled back from zooming into distant stars and galaxies to show clusters and superclusters of galaxies interwoven like a web, creating the large scale structure of the Universe. In 3-D, the structure looks much like the DNA double helix or a backbone.

Now, a new project that aims to map the Universe’s structure has looked back in time to create a 3-D map showing a portion of the Universe as it appeared nine billion years ago. It reveals numerous galaxies and, interestingly, an already-developed large-scale structure of filaments and voids traced by galaxy groups.

The map was created by the FastSound project, which is surveying galaxies in the Universe using the Subaru Telescope’s new Fiber Multi-Object Spectrograph (FMOS). The team doing the work is from Kyoto University, the University of Tokyo and the University of Oxford.

The team said that although galaxy clustering was not as strong when the Universe was 4.7 billion years old as it is in the present-day Universe, gravitational interaction will eventually cause the clustering to grow to the current level.

The new map spans 600 million light years along the angular direction and two billion light years in the radial direction. The team will eventually survey a region totaling about 30 square degrees in the sky and then measure precise distances to about 5,000 galaxies that are more than ten billion light years away.

This is not the first 3-D map of the Universe: the Sloan Digital Sky Survey created one in 2006 with coverage out to five billion light years, updated it just last year, and produced a video flythrough, which you can watch above. Earlier this year, the University of Hawaii also created a 3-D video map showing large-scale cosmic structure out to 300 million light years.

But the FastSound project aims to create a 3-D map of the very distant Universe, covering a volume more than ten billion light years away. FMOS is a wide-field spectroscopy system that enables near-infrared spectroscopy of over 100 objects at a time, with an exceptionally wide field of view when combined with the light-collecting power of the telescope’s 8.2-meter primary mirror.

The map released today is just the first from FastSound. The final 3-D map of the distant Universe will precisely measure the motion of galaxies and then measure the rate of growth of the large-scale structure as a test of Einstein’s general theory of relativity.

Although scientists know that the expansion of the Universe is accelerating, they do not know why: it could be driven by dark energy, or gravity on cosmological scales may differ from that of general relativity. This mystery is one of the biggest questions in contemporary physics and astronomy. A comparison of the 3-D map of the young Universe with the predictions of general relativity could eventually reveal the mechanism behind the mysterious acceleration of the Universe.

The team said their 3-D map shown in this release uses a measure of “comoving” distance rather than light travel distance. They explained:

Light travel distance refers to the time that has elapsed from the epoch of the observed distant galaxy to the present, multiplied by the speed of light. Since the speed of light is always constant for any observer, it describes the distance of the path that a photon has traveled. However, the expansion of the Universe increases the length of the path that the photon traveled in the past. Comoving distance, the geometrical distance in the current Universe, takes this effect into account. Therefore, comoving distance is always larger than the corresponding light travel distance.
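The distinction in the quoted passage can be illustrated numerically. The sketch below is not from the FastSound team; it assumes a simple flat Lambda-CDM cosmology with illustrative parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7) and evaluates the standard distance integrals with a plain trapezoidal rule:

```python
import math

C_KM_S = 299792.458          # speed of light, km/s
H0 = 70.0                    # assumed Hubble constant, km/s/Mpc
OMEGA_M, OMEGA_L = 0.3, 0.7  # assumed matter / dark-energy densities
MPC_TO_GLY = 3.2616e-3       # 1 megaparsec in billions of light years

def E(z):
    """Dimensionless expansion rate H(z)/H0 for a flat universe."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def integrate(f, a, b, n=2000):
    """Trapezoidal rule, so no external libraries are needed."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

def comoving_distance_gly(z):
    # D_C = (c/H0) * integral of dz' / E(z')
    return (C_KM_S / H0) * integrate(lambda x: 1 / E(x), 0, z) * MPC_TO_GLY

def light_travel_distance_gly(z):
    # D_lt = (c/H0) * integral of dz' / ((1 + z') * E(z'))
    return (C_KM_S / H0) * integrate(
        lambda x: 1 / ((1 + x) * E(x)), 0, z) * MPC_TO_GLY

z = 1.5  # roughly the redshift range probed by FastSound
print(f"z = {z}: comoving ~ {comoving_distance_gly(z):.1f} Gly, "
      f"light travel ~ {light_travel_distance_gly(z):.1f} Gly")
```

For a galaxy at redshift z of about 1.5, this gives a comoving distance of roughly 14 billion light years against a light travel distance of roughly 9 billion, consistent with the ranges quoted in the figure caption, and showing that comoving distance is always the larger of the two.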

In the lead image above from FastSound, the colors of the galaxies indicate their star formation rate, i.e., the total mass of stars produced in a galaxy every year. The gradation in the background color represents the number density of galaxies, approximating the underlying mass distribution (which is dominated by invisible dark matter, accounting for about 30% of the total energy in the Universe) as it would look if we could see it. The lower part of the figure shows the relative locations of the FastSound and Sloan Digital Sky Survey (SDSS) regions, indicating that the FastSound project is mapping a more distant Universe than SDSS’s 3-D map of the nearby Universe.

Find out more about FastSound here.

Source: Subaru Telescope

Article by Nancy Atkinson originally posted on Universe Today

Space Colony Art from the 1970s

Three space colony summer studies were conducted at NASA Ames in the 1970s.

These artistic renderings were produced with scientists from NASA’s Ames Research Center during the 1975 summer study at Stanford University. The colonies they depict feature artificial gravity, small mountains, rivers, and trees, and were designed for populations ranging from 10,000 to one million!

Images credit NASA Ames Research Center.

Toroidal Colonies

Population: 10,000

Model of torus colony.

Cutaway view, exposing the interior. Art work: Rick Guidice.

Interior view. Art work: Don Davis.

Construction along the torus rim. Art work: Don Davis.

Exterior view. Art work: Don Davis.

Bernal Spheres

Population: 10,000. The Bernal Sphere is a point design with a spherical living area.

Exterior view. Art work: Rick Guidice.

Cutaway view of Bernal Sphere. Art work: Rick Guidice.

Interior including human powered flight. Art work: Rick Guidice.

Agricultural modules in cutaway view (multiple toroids). Art work: Rick Guidice.

Construction crew at work on the colony. Art work: Don Davis.

Model of a Bernal Sphere.

Cylindrical Colonies

Population: Over a million.

Exterior view of a double cylinder colony. Art work: Rick Guidice.

Interior view looking out through large windows. Art work: Rick Guidice.

Endcap view with suspension bridge. Art work: Don Davis.

Eclipse of the sun with view of clouds and vegetation. Art work: Don Davis.

Multiple two-cylinder colonies aimed toward the sun.

Supercomputer can only simulate 1% of your brain

Image © RIKEN

Researchers from the RIKEN HPCI Program for Computational Life Sciences, using the full computational power of Japan’s K supercomputer, have carried out the largest general neuronal network simulation to date.

The simulated network represented only 1% of the neuronal network in the brain.

“Using NEST, the team, led by Markus Diesmann in collaboration with Abigail Morrison, both now with the Institute of Neuroscience and Medicine at Jülich, succeeded in simulating a network consisting of 1.73 billion nerve cells connected by 10.4 trillion synapses. To realize this feat, the program recruited 82,944 processors of the K computer. The process took 40 minutes to complete the simulation of 1 second of neuronal network activity in real, biological time.”

“If peta-scale computers like the K computer are capable of representing 1% of the network of a human brain today, then we know that simulating the whole brain at the level of the individual nerve cell and its synapses will be possible with exa-scale computers hopefully available within the next decade,” explains Diesmann.
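The quoted figures can be put in perspective with some quick arithmetic. In the sketch below, the human-brain totals (about 86 billion neurons and on the order of 10^15 synapses) are commonly cited estimates, not numbers from the article:

```python
# Numbers reported in the article.
neurons_simulated = 1.73e9    # nerve cells in the K computer run
synapses_simulated = 10.4e12  # connections between them
wall_time_s = 40 * 60         # 40 minutes of computation...
simulated_time_s = 1.0        # ...for 1 second of biological activity

# Assumed, not from the article: commonly cited human-brain estimates.
human_brain_synapses = 1e15   # order-of-magnitude synapse count

slowdown = wall_time_s / simulated_time_s
fraction = synapses_simulated / human_brain_synapses
fan_in = synapses_simulated / neurons_simulated

print(f"Slowdown: {slowdown:.0f}x slower than real time")
print(f"Fraction of brain's synapses simulated: {fraction:.1%}")
print(f"Average synapses per simulated neuron: {fan_in:.0f}")
```

The run was about 2,400 times slower than real time, which is why whole-brain, real-time simulation is pegged to future exa-scale machines rather than today's peta-scale ones.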

via cnet

Source: Wordlesstech

The Truth According To Wikipedia

Google or Wikipedia? Those of us who search online — and who doesn’t? — are getting referred more and more to Wikipedia. For the past two years, this free online “encyclopedia of the people” has been topping the lists of the world’s most popular websites. But do we really know what we’re using? Backlight plunges into the story behind Wikipedia and explores the wonderful world of Web 2.0. Is it a revolution, or pure hype?

Director IJsbrand van Veelen goes looking for the truth behind Wikipedia. Only five people are employed by the company, and all its activities are financed by donations and subsidies. The online encyclopedia that everyone can contribute to and revise is now even bigger than the illustrious Encyclopedia Britannica. Does this spell the end for traditional institutions of knowledge such as Britannica? And should we applaud this development as progress or mourn it as a loss? How reliable is Wikipedia? Do “the people” really hold the lease on wisdom? And since when do we believe that information should be free for all?

In this film, “Wikipedians,” the folks who spend their days writing and editing articles, explain how the online encyclopedia works. In addition, the parties involved discuss Wikipedia’s ethics and quality of content. It quickly becomes clear that there are camps of both believers and critics.

Wiki’s Truth introduces us to the main players in the debate: Jimmy Wales (founder and head Wikipedian), Larry Sanger (co-founder of Wikipedia, now head of Wiki spin-off Citizendium), Andrew Keen (author of The Cult of the Amateur: How Today’s Internet Is Killing Our Culture and Assaulting Our Economy), Phoebe Ayers (a Wikipedian in California), Ndesanjo Macha (Swahili Wikipedia, digital activist), Tim O’Reilly (CEO of O’Reilly Media, the “inventor” of Web 2.0), Charles Leadbeater (philosopher and author of We Think, about crowdsourcing), and Robert McHenry (former editor-in-chief of Encyclopedia Britannica). Opening is a video by Chris Pirillo.

The questions surrounding Wikipedia lead to a bigger discussion of Web 2.0, a phenomenon in which the user determines the content. Examples include YouTube, MySpace, Facebook, and Wikipedia. These sites would appear to provide new freedom and opportunities for undiscovered talent and unheard voices, but just where does the boundary lie between expert and amateur? Who will survive according to the laws of this new “digital Darwinism”? Are equality and truth really reconcilable ideals? And most importantly, has the Internet brought us wisdom and truth, or is it high time for a cultural counterrevolution?

‘Robohope’: First talking humanoid robot launched into space

A small Japanese robot, Kirobo, which can talk, recognize voices and emotions, and learn, has been sent to the International Space Station. Kirobo says his mission is a historic attempt to make friends of robots and humans.

World’s fastest comp: China unveils new top-ranking supercomputer

The Tianhe-2 supercomputer. (Image from netlib.org)

China has developed a new supercomputer, which is twice as fast as US and Japanese systems, early tests show. The new Tianhe-2 (Milkyway-2) is said to be even speedier in theory, and is likely to top the world ranks for years.

The worldwide supercomputer race may again be led by China, and this time for good: even before the official tests, the Tianhe-2 is showing a stunning performance of 30.7 petaflops (quadrillions of floating-point calculations per second), while its closest US rival, Titan, can boast only 17.6 petaflops.

The supercomputer’s capabilities have been confirmed in a detailed report by Jack Dongarra of the University of Tennessee, who recently visited China’s National University of Defense Technology (NUDT), where the Tianhe-2 is currently being tested.

The powerful system was assembled by Chinese company Inspur using tens of thousands of the latest multicore chips produced by Intel, along with some domestically developed technology. In total, the supercomputer is said to contain over 3 million processor cores.

The enormous computer will consume up to 24 megawatts of power under load, and a special liquid coolant is being developed to prevent overheating. It will also have access to some 12 petabytes of storage.

An artist’s rendering of the final look of the Tianhe-2 system. (Image from netlib.org)

Inspur claims the Tianhe-2 is capable of 54 petaflops of peak theoretical compute performance. Earlier reports said China is aiming for no less than a 100-petaflops machine by 2015.
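As a quick sanity check, the figures quoted in this article can be combined directly; the derived ratios below are illustrative, using only the numbers stated above:

```python
# Figures reported in the article.
tianhe2_sustained_pflops = 30.7  # measured in early tests
tianhe2_peak_pflops = 54.0       # Inspur's claimed theoretical peak
titan_pflops = 17.6              # closest US rival, Titan
power_megawatts = 24.0           # maximum power draw under load

lead = tianhe2_sustained_pflops / titan_pflops
efficiency = tianhe2_sustained_pflops / tianhe2_peak_pflops
# Petaflops -> gigaflops is a factor of 1e6; megawatts -> watts likewise.
gflops_per_watt = (tianhe2_sustained_pflops * 1e6) / (power_megawatts * 1e6)

print(f"Lead over Titan: {lead:.2f}x")                 # ~1.74x
print(f"Sustained/peak efficiency: {efficiency:.0%}")  # ~57%
print(f"Energy efficiency: {gflops_per_watt:.2f} Gflops per watt")
```

So the measured 30.7 petaflops is about 1.74 times Titan's figure, roughly matching the "twice as fast" claim, and about 57% of the claimed theoretical peak.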

Such impressive performance may be required for designing or modeling complex defense and civil technologies, such as testing aircraft, as is expected in the case of the Tianhe-2.

Economic analysis is another feasible job for the supercomputer.

But governments also make heavy use of supercomputers for gathering intelligence, and NUDT has listed aiding in government security as one of the possible uses for the Tianhe-2.

The Tianhe-2 will provide “an open platform for research and education and provide high performance computing service for southern China” after it moves to the Chinese National Supercomputer Center in Guangzhou this year, according to the report.

The system will likely top the biannual Top 500 supercomputer list compiled by a joint US-German group of scientists, in which the report’s author Dongarra participates.

For the US, which has largely dominated the list so far, the new Chinese supercomputer should serve as a “wake-up call,” and an indication the country might fall behind in the high-tech race for years, Dongarra said in an interview on Wednesday.

via RT
