|
Post by Admin on Sept 13, 2021 16:54:06 GMT
The Journey to Define Dimension

The concept of dimension seems simple enough, but mathematicians struggled for centuries to precisely define and understand it.

www.quantamagazine.org/a-mathematicians-guided-tour-through-high-dimensions-20210913

The notion of dimension at first seems intuitive. Glancing out the window we might see a crow sitting atop a cramped flagpole experiencing zero dimensions, a robin on a telephone wire constrained to one, a pigeon on the ground free to move in two and an eagle in the air enjoying three. But as we’ll see, finding an explicit definition for the concept of dimension and pushing its boundaries has proved exceptionally difficult for mathematicians. It’s taken hundreds of years of thought experiments and imaginative comparisons to arrive at our current rigorous understanding of the concept.

The ancients knew that we live in three dimensions. Aristotle wrote, “Of magnitude that which (extends) one way is a line, that which (extends) two ways is a plane, and that which (extends) three ways a body. And there is no magnitude besides these, because the dimensions are all that there are.” Yet mathematicians, among others, have enjoyed the mental exercise of imagining more dimensions. What would a fourth dimension — somehow perpendicular to our three — look like?

One popular approach: Suppose our knowable universe is a two-dimensional plane in three-dimensional space. A solid ball hovering above the plane is invisible to us. But if it falls and contacts the plane, a dot appears. As it continues through the plane, a circular disk grows until it reaches its maximum size. It then shrinks and disappears. It is through these cross sections that we see three-dimensional shapes. Similarly, in our familiar three-dimensional universe, if a four-dimensional ball were to pass through, it would appear as a point, grow into a solid ball, eventually reach its full radius, then shrink and disappear.
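The growing-and-shrinking disk in that thought experiment follows directly from the Pythagorean theorem, and a few lines of code make it concrete. This is an illustrative sketch (the function name and the sample radius values are mine, not from the article):

```python
import math

def cross_section_radius(R, h):
    """Radius of the slice seen where an n-dimensional ball of
    radius R, whose center sits h units away from our hyperplane,
    intersects that hyperplane. By the Pythagorean theorem the
    slice radius is sqrt(R^2 - h^2); if |h| > R there is no
    intersection at all."""
    return math.sqrt(R * R - h * h) if abs(h) <= R else 0.0

# A 4D ball of radius 1 "falling" through our 3D universe:
# invisible, then a point, then a growing solid ball, then full size.
for h in (1.5, 1.0, 0.5, 0.0):
    print(h, round(cross_section_radius(1.0, h), 3))
```

The same function describes the 2D disk a 3D ball traces on a plane; only the dimension of the slice changes, not the arithmetic.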
This gives us a sense of the four-dimensional shape, but there are other ways of thinking about such figures. For example, let’s try visualizing the four-dimensional equivalent of a cube, known as a tesseract, by building up to it. If we begin with a point, we can sweep it in one direction to obtain a line segment. When we sweep the segment in a perpendicular direction, we obtain a square. Dragging this square in a third perpendicular direction yields a cube. Likewise, we obtain a tesseract by sweeping the cube in a fourth direction.
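The sweep construction can also be played out directly in code: each sweep duplicates the vertex set at coordinates 0 and 1 along a new axis, and two vertices of a unit hypercube share an edge exactly when they differ in one coordinate. A minimal sketch (the helper names are mine, for illustration):

```python
def sweep(shape):
    """Sweep a shape one unit along a new perpendicular axis:
    keep a copy at coordinate 0 and a copy at coordinate 1."""
    return [v + (0,) for v in shape] + [v + (1,) for v in shape]

def edges(vertices):
    """Two unit-hypercube vertices share an edge iff they differ
    in exactly one coordinate."""
    return [(a, b) for i, a in enumerate(vertices)
            for b in vertices[i + 1:]
            if sum(x != y for x, y in zip(a, b)) == 1]

shape = [()]             # a point: one vertex, no coordinates
for dim in range(1, 5):  # line, square, cube, tesseract
    shape = sweep(shape)
    print(dim, len(shape), len(edges(shape)))
```

Each sweep doubles the vertex count, so an n-cube has 2^n vertices and n * 2^(n-1) edges; the tesseract ends up with 16 vertices and 32 edges.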
|
|
|
Post by Admin on Oct 1, 2021 17:57:27 GMT
|
|
|
Post by Admin on Oct 1, 2021 20:33:36 GMT
THE MYSTERIOUS TIMELESSNESS OF MATH

www.philosophytalk.org/shows/mysterious-timelessness-math

Math is a really useful subject—at least, that's what your parents and teachers told you. But math also leads to scenarios, like Zeno's paradoxes, that seem to inspire skepticism. So why do we believe in math and rely on it to build bridges and spaceships? How can anyone discover the secrets of the universe by simply scribbling numbers on a piece of paper? Is math some kind of magic, or does it have a more ordinary explanation? And could math be culturally relative, or are its concepts timeless and universal? Josh and Ray add things up with Arezoo Islami from SF State University.
|
|
|
Post by Admin on Oct 1, 2021 20:38:29 GMT
How to Embed Trust Into the Foundations of the Internet
By Aaron Frank - Sep 19, 2021
singularityhub.com/2021/09/19/how-to-embed-trust-into-the-foundations-of-the-internet/

Earlier this year, a digital artist conned unsuspecting NFT collectors to highlight a vulnerability in the way cryptographically secured assets are managed online. The anonymous artist, known by their Twitter handle @neitherconfirm, sold a collection of stylized portraits as NFTs, but once the pieces sold, immediately changed the image files associated with the tokens to photos of rugs. And not even originals—just watermarked pictures of ugly carpets.

The symbolism wasn’t lost on the crypto community, where “rug pulls” are a well-known scam in which unsuspecting traders are left holding worthless cryptocurrency. The mostly harmless prank by @neitherconfirm calls attention to the way some NFT file storage relies on centralized mechanisms through which single individuals can still manipulate the data associated with an NFT. Similarly, if a digital marketplace hosting and minting NFTs with centralized addresses later disappears, those NFTs may become worthless. Some collectors buying NFT tweets, for example, learned the hard way that if a tokenized tweet is deleted, they become the proud owner of an NFT pointing to nothing at all.

It’s understandable if you’re skeptical of NFT collecting and wonder whether it’s merely a casino for the crypto elite. So why care about the challenges of NFT data management? Behind the hype, there could be something substantial taking shape. The protocol now widely used to mitigate these issues, called the InterPlanetary File System (IPFS for short), has broader applications and could fundamentally reshape how all data is managed across the web. When I recently spoke to Molly Mackinlay, who leads product and engineering at Protocol Labs—a company overseeing the development of IPFS—she suggested the protocol may affect a range of significant sociopolitical systems.
IPFS-enabled file preservation and data authentication could impact judicial systems and historical archiving in the digital world, and even bolster the fight against “fake news” and misinformation at a time when trust in journalism is declining. During our conversation, Mackinlay said today’s internet architecture requires us to trust centralized intermediaries (and those with access to them) not to quietly change online information like news articles, scientific data sets, or images associated with an NFT. But as we’ve seen, the internet is always changing in both obvious and subtle ways.

The early days of the Covid-19 pandemic in the US offer a relevant case in point: the Trump administration ordered hospitals to send patient data to centralized databases in Washington, bypassing the CDC, which traditionally had received such data. The unusual move prompted fears that the data might be altered in politicized ways that could undermine research efforts. There were similar fears that the EPA might modify climate data.

“Understanding which version of a data file you’re accessing should be built directly into data on the internet, and if you need to reference a specific version of an article, image, or scientific data set, you should know if the thing you’re getting back has been altered,” Mackinlay said. As a protocol, IPFS could yield a more dependable archive for our ephemeral internet. “Ultimately, what we’re talking about is technologically embedding trust right into the protocol itself,” Mackinlay said.

To understand the implications of IPFS, Mackinlay framed it within the development of Web3, a significant shift in the way we design the internet. At its core, Web3 is a return to a decentralized internet: an online world that’s less reliant on centralized institutions and has layers of authentication embedded directly into its architecture.
In a 2018 talk, Protocol Labs CEO Juan Benet suggested Web3 could do for internet services and applications what Bitcoin hopes to do for money—remove centralized intermediaries while preserving trust.

At its heart, IPFS is a peer-to-peer data storage system (not unlike the original Napster or BitTorrent). Instead of storing files on a central server, data is distributed across a network of participants incentivized to host and verify the legitimacy of the data. Those willing to offer unused hard drive space to store IPFS “objects” are rewarded by a complementary system called Filecoin, a blockchain that oversees payments to those storing files and data.

Another critical aspect of IPFS relates to something called addressing—how internet users access content online. Protocol Labs hopes to swap out the widespread use of location-based addressing for something called “content addressing.” With location addressing, URLs and domain names point to a specific place where an image file or news article is hosted, and it doesn’t matter to the URL what content is stored there. It can be an NFT of a portrait one day and an ugly carpet the next.

Content-based addressing, by contrast, manages data by confirming and verifying what the file is rather than where it’s located. Every piece of data on the IPFS network is stored as an “object” and given a unique hash (a sort of digital fingerprint). When someone enters an IPFS web address, they are asking the network to show them the file associated with the specific hash entered. And because the data cannot be changed without also changing the hash associated with it, the user can trust that the file returned by the network contains the legitimate data they requested. NFT collectors purchasing a file stored with IPFS can thus be sure the NFT is associated with a piece of content that cannot be changed.
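The core guarantee of content addressing can be demonstrated in a few lines. This is a deliberately simplified sketch using a bare SHA-256 digest; real IPFS content identifiers (CIDs) wrap the hash in multihash, codec, and version metadata, but the principle is the same: the address is derived from the content, not from its location.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Toy content address: a hex digest of the bytes themselves.
    Any change to the bytes yields a different address."""
    return hashlib.sha256(data).hexdigest()

portrait = b"stylized portrait pixels"
rug = b"watermarked photo of an ugly carpet"

addr = content_address(portrait)

# Requesting `addr` can only ever return the portrait bytes:
# swapping the file for the rug changes its address, so the
# @neitherconfirm rug-pull is impossible under content addressing.
assert content_address(portrait) == addr
assert content_address(rug) != addr
print(addr[:16], "...")
```

Under location addressing, by contrast, the name stays fixed while the content behind it may change silently; here the name and the content are cryptographically bound together.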
Beyond securing the long-term value of NFTs, a range of organizations are using IPFS, including Project Starling, a joint venture from Reuters, Stanford, and USC aiming to boost trust in news media. During Reuters’ coverage of the 2020 US election, photojournalists were given devices that used IPFS to create hashes for photographs and then upload them to Filecoin’s decentralized storage network. In this way, the authenticity of each image was preserved at the point of capture. The hope is that IPFS will make manipulating news images increasingly difficult in the future.

It’s worth noting that IPFS is as much a community of independent node operators as it is a core technology. It’s not clear, for example, how decentralized (and takedown-resistant) file hosting will deal with the inevitable challenges of copyright issues and other more objectionable content. In these cases, Mackinlay pointed out that the burden of responsibility to comply with local and federal laws shifts to individuals participating as nodes on the network, and pointed toward the beginnings of a content moderation mechanism being designed for a decentralized web. Given the social challenges that centralized platform companies have had in moderating content online, the idea of a public, transparent, and community-driven moderation process could even be a welcome change. At a minimum, it’s clearly recognized by the Web3 community as an issue to focus on.

“IPFS is designed around the belief that no one person or company should have unilateral control over all available content on the internet. No node should be forced to host content they don’t want to, and vice versa, no central node controls what the entire network of independent nodes can and can’t host,” Mackinlay said.
As decentralized architecture works to replace the centralized systems of today’s internet, IPFS could grow to power many of the coming products and services at the heart of Web3. While it’s not yet clear whether unfettered decentralization is good for every aspect of our online lives, it’s likely that it will be useful for many things. And for those use cases, IPFS should prove to be a solid protocol for the online world of tomorrow.
|
|
|
Post by Admin on Oct 1, 2021 20:45:49 GMT
History of the center of the Universe
en.wikipedia.org/wiki/History_of_the_center_of_the_Universe

The center of the Universe is a concept that lacks a coherent definition in modern astronomy; according to standard cosmological theories on the shape of the universe, it has no center. Historically, different people have suggested various locations as the center of the Universe. Many mythological cosmologies included an axis mundi, the central axis of a flat Earth that connects the Earth, heavens, and other realms together. In 4th-century BC Greece, philosophers developed the geocentric model, based on astronomical observation; this model proposed that the center of the Universe lies at the center of a spherical, stationary Earth, around which the Sun, Moon, planets, and stars rotate. With the development of the heliocentric model by Nicolaus Copernicus in the 16th century, the Sun was believed to be the center of the Universe, with the planets (including Earth) and stars orbiting it. In the early 20th century, the discovery of other galaxies and the development of the Big Bang theory led to cosmological models of a homogeneous, isotropic Universe, which lacks a central point and is expanding at all points.

Multiverse
en.wikipedia.org/wiki/Multiverse

The multiverse is a hypothetical group of multiple universes. Together, these universes comprise everything that exists: the entirety of space, time, matter, energy, information, and the physical laws and constants that describe them. The different universes within the multiverse are called "parallel universes", "other universes", "alternate universes", or "many worlds".
|
|
|
Post by Admin on Oct 14, 2021 16:42:44 GMT
|
|
|
Post by Admin on Oct 15, 2021 14:53:37 GMT
Behold the Megatron: Microsoft and Nvidia build massive language processor
MT-NLG is a beast that fed on over 4,000 GPUs
Katyanna Quach Tue 12 Oct 2021 // 00:36 UTC
www.theregister.com/2021/10/12/nvidia_microsoft_mtnlg/

Nvidia and Microsoft announced their largest monolithic transformer language model to date, an AI model with a whopping 530 billion parameters they developed together, named the Megatron-Turing Natural Language Generation model (MT-NLG). MT-NLG is more powerful than previous transformer-based systems trained by the two companies, namely Microsoft’s Turing-NLG model and Nvidia’s Megatron-LM. Made up of three times more parameters spread across 105 layers, MT-NLG is much larger and more complex. For comparison, OpenAI’s GPT-3 model has 175 billion parameters and Google’s Switch Transformer demo has 1.6 trillion parameters.

Bigger is generally better when it comes to neural networks, though larger models also require more training data to ingest. MT-NLG is better than its predecessors at a wide variety of natural language tasks, such as auto-completing sentences, question answering, and reading and reasoning. It can also perform these tasks with little to no fine-tuning, something referred to as few-shot or zero-shot learning.

As these language models grow larger, AI researchers and engineers need to come up with all sorts of techniques and tricks to train them. It requires careful coordination: the model and its training data have to be stored and processed across numerous chips at the same time.

MT-NLG was trained using Nvidia’s Selene machine learning supercomputer, a system made up of 560 DGX A100 servers, each containing eight A100 80GB GPUs. Selene is also powered by AMD EPYC 7742 CPU processors and is estimated to cost over $85m, according to The Next Platform. All 4,480 GPUs use NVLink and NVSwitch to connect to one another. Each one was capable of over 113 teraFLOPS.
It’s incredibly expensive to train these models, and even running on top-of-the-range hardware, software hacks are needed to reduce training times. Nvidia and Microsoft used DeepSpeed, a deep learning library containing PyTorch code that allowed engineers to cram more data across numerous pipelines in parallel to scale up Megatron-LM. In all, 1.5TB of data was processed to train the model, a process that took a little over a month.

“By combining tensor-slicing and pipeline parallelism, we can operate them within the regime where they are most effective,” Paresh Kharya, senior director of product management and marketing for accelerated computing at Nvidia, and Ali Alvi, group program manager for the Microsoft Turing team, explained in a blog post. “More specifically, the system uses tensor-slicing from Megatron-LM to scale the model within a node and uses pipeline parallelism from DeepSpeed to scale the model across nodes.

“For example, for the 530 billion model, each model replica spans 280 Nvidia A100 GPUs, with 8-way tensor-slicing within a node and 35-way pipeline parallelism across nodes. We then use data parallelism from DeepSpeed to scale out further to thousands of GPUs.”
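The arithmetic behind those parallelism figures is easy to check. Note the replica count at the end is implied by the numbers in the article rather than stated outright:

```python
# Degrees of parallelism described in the Nvidia/Microsoft blog post
tensor_parallel = 8     # tensor-slicing within a node (8 GPUs per DGX A100)
pipeline_parallel = 35  # pipeline stages spread across nodes

# GPUs needed to hold one full copy of the 530B model
gpus_per_replica = tensor_parallel * pipeline_parallel
print(gpus_per_replica)  # 280, matching the figure quoted in the post

# Selene: 560 DGX A100 servers x 8 GPUs each
total_gpus = 560 * 8
print(total_gpus)        # 4480

# Data parallelism then replicates the model to fill the machine,
# implying this many replicas training simultaneously:
data_parallel = total_gpus // gpus_per_replica
print(data_parallel)     # 16
```

The three degrees of parallelism multiply: 8 x 35 x 16 = 4,480, exactly the GPU count of the full machine.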
|
|
|
Post by Admin on Oct 16, 2021 18:23:14 GMT
Wild New Paper Claims Earth May Be Surrounded by a Giant Magnetic Tunnel
MICHELLE STARR, 15 OCTOBER 2021
www.sciencealert.com/earth-may-be-surrounded-by-a-giant-magnetic-tunnel

Mysterious structures in the sky that have puzzled astronomers for decades might finally have an explanation – and it's quite something. The North Polar Spur and the Fan Region, on opposite sides of the sky, may be connected by a vast system of magnetized filaments. These form a structure resembling a tunnel that circles the Solar System, and many nearby stars besides.

"If we were to look up in the sky," said astronomer Jennifer West of the University of Toronto in Canada, "we would see this tunnel-like structure in just about every direction we looked – that is, if we had eyes that could see radio light."

We've known about the two structures for quite some time – since the 1960s, in fact – but they have been difficult to understand. That's because it's really hard to work out exactly how far away they are; distance estimates have ranged from hundreds to thousands of light-years. However, no analysis had ever linked the two structures together. West and her colleagues were able to show that the two regions, and prominent radio loops in the space between them, could be linked, solving many of the puzzling problems associated with both.

"A few years ago, one of our co-authors, Tom Landecker, told me about a paper from 1965, from the early days of radio astronomy. Based on the crude data available at that time, the authors (Mathewson & Milne) speculated that these polarized radio signals could arise from our view of the Local Arm of the galaxy, from inside it," West explained. "That paper inspired me to develop this idea and tie my model to the vastly better data that our telescopes give us today."
Using modelling and simulations, the researchers figured out what the radio sky would look like if the two structures were connected by magnetic filaments, playing with parameters such as distance to determine the best fit. From this, the team determined that the most likely distance of the structures from the Solar System is around 350 light-years, consistent with some of the closer estimates. This includes an estimate for the distance of the North Polar Spur made earlier this year based on Gaia data, which found that almost all of the spur is within 500 light-years. The entire length of the tunnel modelled by West and her team is around 1,000 light-years.
|
|
|
Post by Admin on Oct 18, 2021 14:55:24 GMT
The most powerful space telescope ever built will look back in time to the Dark Ages of the universe
October 12, 2021 8.31pm BST
theconversation.com/the-most-powerful-space-telescope-ever-built-will-look-back-in-time-to-the-dark-ages-of-the-universe-169603

Some have called NASA’s James Webb Space Telescope the “telescope that ate astronomy.” It is the most powerful space telescope ever built and a complex piece of mechanical origami that has pushed the limits of human engineering. On Dec. 18, 2021, after years of delays and billions of dollars in cost overruns, the telescope is scheduled to launch into orbit and usher in the next era of astronomy.

I’m an astronomer with a specialty in observational cosmology – I’ve been studying distant galaxies for 30 years. Some of the biggest unanswered questions about the universe relate to its early years, just after the Big Bang. When did the first stars and galaxies form? Which came first, and why? I am incredibly excited that astronomers may soon uncover the story of how galaxies started, because James Webb was built specifically to answer these very questions.
|
|
|
Post by Admin on Oct 19, 2021 16:07:11 GMT
Strange radio waves emerge from the direction of the galactic center
A variable signal aligned to the heart of the Milky Way is tantalising scientists
Date: October 12, 2021
Source: University of Sydney
www.sciencedaily.com/releases/2021/10/211012080039.htm

Astronomers have discovered unusual signals coming from the direction of the Milky Way's centre. The radio waves fit no currently understood pattern of variable radio source and could suggest a new class of stellar object.

"The strangest property of this new signal is that it has a very high polarisation. This means its light oscillates in only one direction, but that direction rotates with time," said Ziteng Wang, lead author of the new study and a PhD student in the School of Physics at the University of Sydney. "The brightness of the object also varies dramatically, by a factor of 100, and the signal switches on and off apparently at random. We've never seen anything like it."

Many types of star emit variable light across the electromagnetic spectrum. With tremendous advances in radio astronomy, the study of variable or transient objects in radio waves is a huge field of study helping us to reveal the secrets of the Universe. Pulsars, supernovae, flaring stars and fast radio bursts are all types of astronomical objects whose brightness varies.

"At first we thought it could be a pulsar -- a very dense type of spinning dead star -- or else a type of star that emits huge solar flares. But the signals from this new source don't match what we expect from these types of celestial objects," Mr Wang said. The discovery of the object has been published today in the Astrophysical Journal.
Mr Wang and an international team, including scientists from Australia's national science agency CSIRO, Germany, the United States, Canada, South Africa, Spain and France, discovered the object using CSIRO's ASKAP radio telescope in Western Australia. Follow-up observations were made with the South African Radio Astronomy Observatory's MeerKAT telescope.

Mr Wang's PhD supervisor is Professor Tara Murphy, also from the Sydney Institute for Astronomy and the School of Physics. Professor Murphy said: "We have been surveying the sky with ASKAP to find unusual new objects with a project known as Variables and Slow Transients (VAST), throughout 2020 and 2021.

"Looking towards the centre of the Galaxy, we found ASKAP J173608.2-321635, named after its coordinates. This object was unique in that it started out invisible, became bright, faded away and then reappeared. This behaviour was extraordinary."

After detecting six radio signals from the source over nine months in 2020, the astronomers tried to find the object in visible light. They found nothing. They turned to the Parkes radio telescope and again failed to detect the source. Professor Murphy said: "We then tried the more sensitive MeerKAT radio telescope in South Africa. Because the signal was intermittent, we observed it for 15 minutes every few weeks, hoping that we would see it again.

"Luckily, the signal returned, but we found that the behaviour of the source was dramatically different -- the source disappeared in a single day, even though it had lasted for weeks in our previous ASKAP observations." However, this further discovery did not reveal much more about the secrets of this transient radio source.

Mr Wang's co-supervisor, Professor David Kaplan from the University of Wisconsin-Milwaukee, said: "The information we do have has some parallels with another emerging class of mysterious objects known as Galactic Centre Radio Transients, including one dubbed the 'cosmic burper'.
"While our new object, ASKAP J173608.2-321635, does share some properties with GCRTs there are also differences. And we don't really understand those sources, anyway, so this adds to the mystery." The scientists plan to keep a close eye on the object to look for more clues as to what it might be. "Within the next decade, the transcontinental Square Kilometre Array (SKA) radio telescope will come online. It will be able to make sensitive maps of the sky every day," Professor Murphy said. "We expect the power of this telescope will help us solve mysteries such as this latest discovery, but it will also open vast new swathes of the cosmos to exploration in the radio spectrum."
|
|
|
Post by Admin on Oct 23, 2021 15:33:54 GMT
|
|
|
Post by Admin on Oct 25, 2021 9:20:09 GMT
New nuclear fusion reactor design may be a breakthrough
Using permanent magnets may help to make nuclear fusion reactors simpler and more affordable.
bigthink.com/hard-science/nuclear-fusion-reactor/#Echobox=1634877974

The promise of nuclear fusion is tantalizing: By utilizing the same atomic process that powers our sun, we may someday be able to generate virtually unlimited amounts of clean energy. But while fusion reactors have been around since the 1950s, scientists haven’t been able to create designs that can produce energy in a sustainable manner. Standing in the way of nuclear fusion are politics, lack of funding, concerns about the power source, and potentially insurmountable technological problems, to name a few roadblocks. Today, the nuclear fusion reactors we have are stuck at the prototype stage.

However, researcher Michael Zarnstorff may have recently made a significant breakthrough while helping his son with a science project. In a new paper, Zarnstorff, a chief scientist at the Max Planck Princeton Research Center for Plasma Physics in New Jersey, and his colleagues describe a simpler design for a stellarator, one of the most promising types of nuclear fusion reactors.

Fusion reactors generate power by smashing together, or fusing, two atomic nuclei to produce one or more heavier nuclei. This process can unleash vast amounts of energy. But achieving fusion is difficult. It requires heating hydrogen plasma to over 100,000,000°C, until the hydrogen nuclei fuse and generate energy. Unsurprisingly, this super-hot plasma is hard to work with, and it can damage and corrode the expensive hardware of the reactor.
|
|
|
Post by Admin on Oct 26, 2021 17:00:14 GMT
Scientists create single-atom devices to supercharge computers
Researchers devise groundbreaking new methods to create and duplicate single-atom transistors for quantum computers.
bigthink.com/hard-science/scientists-create-single-atom-devices-to-supercharge-computers/#Echobox=1635050458

Researchers from the National Institute of Standards and Technology (NIST) and the University of Maryland were able to create single-atom transistors for only the second time ever. They also achieved an unprecedented quantum mechanics feat, allowing for the future development of computers. The tiny devices could be crucial in creating qubits, leading to next-generation technology.

Tiny technologies could have tremendous effects on the next generation of computers, supercharging memory and processing abilities. Key to these advancements would be the creation of transistors the size of several or even single atoms. Newly published research from scientists at NIST and the University of Maryland provides a blueprint for how to create such microscopic tech. A big challenge in this endeavor is figuring out how to duplicate such small transistors, which would act like tiny on-off switches, reports Science News.

Using the recipe they devised, the team led by NIST became just the second ever to create a single-atom transistor, and the first ever to produce a series of transistors with only a single electron each, whose geometry could be manipulated at the atomic level. The scientists were also able to gain control over the phenomenon of quantum tunneling, changing the rate at which individual electrons travelled through a physical gap, or the transistor’s electrical barrier. The significance of managing this process lies in allowing the transistors to become “entangled” according to the laws of quantum mechanics.
This can lead to new ways of creating quantum bits (qubits) – the basic unit of information in quantum computing.
|
|
|
Post by Admin on Nov 1, 2021 12:29:15 GMT
|
|
|
Post by Admin on Nov 3, 2021 17:51:45 GMT
NOVEMBER 2, 2021
Gravitational 'kick' may explain the strange shape at the center of Andromeda
by Daniel Strain, University of Colorado at Boulder
phys.org/news/2021-11-gravitational-strange-center-andromeda.html

When two galaxies collide, the supermassive black holes at their cores release a devastating gravitational "kick," similar to the recoil from a shotgun. New research led by CU Boulder suggests that this kick may be so powerful it can knock millions of stars into wonky orbits.

The research, published Oct. 29 in The Astrophysical Journal Letters, helps solve a decades-old mystery surrounding a strangely shaped cluster of stars at the heart of the Andromeda Galaxy. It might also help researchers better understand the process of how galaxies grow by feeding on each other.

"When scientists first looked at Andromeda, they were expecting to see a supermassive black hole surrounded by a relatively symmetric cluster of stars," said Ann-Marie Madigan, a fellow of JILA, a joint research institute between CU Boulder and the National Institute of Standards and Technology (NIST). "Instead, they found this huge, elongated mass." Now, she and her colleagues think they have an explanation.

In the 1970s, scientists launched balloons high into Earth's atmosphere to take a close look at Andromeda, the galaxy nearest the Milky Way, in ultraviolet light. The Hubble Space Telescope followed up on those initial observations in the 1990s and delivered a surprising finding: Like our own galaxy, Andromeda is shaped like a giant spiral. But the area rich in stars near that spiral's center doesn't look like it should––the orbits of these stars take on an odd, ovalish shape, like someone stretched out a wad of Silly Putty. And no one knew why, said Madigan, also an assistant professor of astrophysics. Scientists call the pattern an "eccentric nuclear disk."
In the new study, the team used computer simulations to track what happens when two supermassive black holes go crashing together––Andromeda likely formed during a similar merger billions of years ago. Based on the team's calculations, the force generated by such a merger could bend and pull the orbits of stars near a galactic center, creating that telltale elongated pattern.

"When galaxies merge, their supermassive black holes are going to come together and eventually become a single black hole," said Tatsuya Akiba, lead author of the study and a graduate student in astrophysics. "We wanted to know: What are the consequences of that?"

Bending space and time

He added that the team's findings help to reveal some of the forces that may be driving the diversity of the estimated two trillion galaxies in the universe today––some of which look a lot like the spiral-shaped Milky Way, while others look more like footballs or irregular blobs. Mergers may play an important role in shaping these masses of stars: When galaxies collide, Akiba said, the black holes at their centers may begin to spin around each other, moving faster and faster until they eventually slam together. In the process, they release huge pulses of "gravitational waves," or literal ripples in the fabric of space and time.

"Those gravitational waves will carry momentum away from the remaining black hole, and you get a recoil, like the recoil of a gun," Akiba said. He and Madigan wanted to know what such a recoil could do to the stars within 1 parsec, or roughly 19 trillion miles, of a galaxy's center. Andromeda, which can be seen with the naked eye from Earth, stretches tens of thousands of parsecs from end to end. It gets pretty wild.

Galactic recoil

The duo used computers to build models of fake galactic centers containing hundreds of stars––then kicked the central black hole to simulate the recoil from gravitational waves.
Madigan explained the gravitational waves produced by this kind of disastrous collision won't affect the stars in a galaxy directly. But the recoil will throw the remaining supermassive black hole back through space––at speeds that can reach millions of miles per hour, not bad for a body with a mass millions or billions of times greater than that of Earth's sun.

"If you're a supermassive black hole, and you start moving at thousands of kilometers per second, you can actually escape the galaxy you're living in," Madigan said.

When black holes don't escape, however, the team discovered they may pull on the orbits of the stars right around them, causing those orbits to stretch out. The result winds up looking a lot like the shape scientists see at the center of Andromeda. Madigan and Akiba said they want to grow their simulations so they can directly compare their computer results to that real-life galaxy core––which contains many times more stars.

They noted their findings might also help scientists to understand the unusual happenings around other objects in the universe, such as planets orbiting mysterious bodies called neutron stars. "This idea––if you're in orbit around a central object and that object suddenly flies off––can be scaled down to examine lots of different systems," Madigan said.
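The team's simulations are far more sophisticated, but the basic mechanism, a velocity kick stretching circular orbits into eccentric ones, can be sketched in a toy two-body model. Everything here (units, kick size, star count) is invented for illustration and is not the paper's actual setup; the kick is applied to the stars in the black hole's frame, which is equivalent to kicking the hole itself.

```python
import math, random

G, M = 1.0, 1.0  # toy units: gravitational constant, black hole mass

def eccentricity(x, y, vx, vy):
    """Orbital eccentricity of a test star about a point mass at the
    origin, from specific orbital energy E and angular momentum L:
    e = sqrt(1 + 2*E*L^2 / (G*M)^2)."""
    r = math.hypot(x, y)
    E = 0.5 * (vx * vx + vy * vy) - G * M / r
    L = x * vy - y * vx
    return math.sqrt(max(0.0, 1.0 + 2.0 * E * L * L / (G * M) ** 2))

# Stars start on circular orbits (eccentricity ~ 0) around the hole.
random.seed(1)
stars = []
for _ in range(200):
    r = random.uniform(0.5, 1.0)
    theta = random.uniform(0.0, 2.0 * math.pi)
    v = math.sqrt(G * M / r)  # circular orbital speed at radius r
    stars.append((r * math.cos(theta), r * math.sin(theta),
                  -v * math.sin(theta), v * math.cos(theta)))

# A gravitational-wave recoil kicks the hole; in the hole's rest frame
# every star's velocity shifts by the same amount the opposite way.
kick = 0.3
kicked = [(x, y, vx - kick, vy) for x, y, vx, vy in stars]

before = sum(eccentricity(*s) for s in stars) / len(stars)
after = sum(eccentricity(*s) for s in kicked) / len(kicked)
print(round(before, 3), round(after, 3))  # mean eccentricity rises
```

The circular orbits start at e close to 0; after the kick the mean eccentricity jumps, the one-line version of the stretched, "eccentric nuclear disk" shape described above.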
|
|