
IT Recruitment Poland – an Effective Way to find IT Developers in Poland



If you have ever searched for IT workers, you know very well that finding trustworthy professionals is not at all easy. We live in unstable times, and constant insecurity leads to business failures. That is why we so badly need reliable colleagues and effective solutions. One way to succeed is to work with experienced recruitment specialists. The IT recruitment Poland service offered by the Sowelo agency is just such a key to success.

IT recruitment Poland is a fast way to increase the number of skilled IT employees in your company. Image credit: Pexels, free license

Sowelo has been in business for more than 15 years and has dozens of successful recruitment projects to its credit. Sowelo headhunters work with both Polish and international clients, recruiting specialists from different backgrounds, cultures and languages for a wide variety of positions.

When looking for highly qualified employees, it is worth drawing on the experience of professional headhunters from independent agencies. With their help, you can not only quickly find professionals for a given position but also see results in your business much faster. Sowelo offers a tailor-made recruitment service, adapted precisely to the expectations of each client. Unlike in-house HR specialists, Sowelo's staff have recruitment expertise gained from working with clients at home and abroad, so their offer and assistance are much more extensive.

IT Recruitment Poland by Sowelo – minimizing risks

The Sowelo agency has been on the market since 2007 and has weathered many different situations in the recruitment market. Drawing on this experience, its consultants have served many clients both at home and abroad. Sowelo adapts its activities and offerings to changing market conditions, and its know-how gives it a significant advantage over in-house HR staff.

Among the projects Sowelo has completed so far, it is worth mentioning searches for Branch Manager, Chief Accountant, Operations Director, Sales Executive, Team Supervisor, Unit Manager, PMO Director and System Architect roles, among others. High-quality specialists representing various industries and countries have been found. Sowelo's clients are very satisfied with their cooperation with the agency, eagerly return with further projects, and recommend the agency to other companies. Sowelo offers services such as IT Contracting, Recruitment Process Outsourcing, Talent Market Mapping and Employer Branding.


Poland – associative photo. Image credit: Unsplash, free license

Working with a recruitment agency minimizes the risk of unsuccessful recruitment. Together with the client, consultants create a profile of the ideal candidate and develop a list of requirements that he or she must meet. Carefully selected candidates with the required skills and competencies are invited to a job interview. Sowelo employees can quickly find candidates even for very niche positions, and they can also reach people who are not currently looking for a job and get them interested in the offer.

Cooperation with Sowelo Consulting agency is also, as already mentioned, an opportunity to avoid the serious consequences that bad hires bring. After all, hiring the wrong people can not only cost a lot, but also drag down a company and lead to its bankruptcy.

IT Recruitment Poland by Sowelo – professional knowledge and up-to-date information from the labor market

Sowelo's expertise and experience is something anyone can benefit from at any time. IT Recruitment Poland is a service designed both for clients at the early stages of their operation and for companies that have been on the market for some time. The advice and assistance of a good, experienced recruiter saves not only a lot of money but also a lot of time. You avoid the consequences of bad hires and gain the time you need to run your business and make your company more competitive.

Sowelo consultants have contacts that in-house HR staff, unfortunately, do not. This allows them to quickly reach the best candidates for a given position, even the most niche one. This matters because high-level professionals are often reluctant to deal with corporate HR staff, do not trust them and are unwilling to cooperate. They are more open to independent headhunters and much more receptive to the job offers they present.

Independent headhunters can find truly hidden, rare talent. It is not a problem for them to find specialists representing very rare specializations, because they deal with such cases on a daily basis. As part of the IT Recruitment Poland service, they will quickly find specialists with unique skills or using very rare programming languages.

The key role of recruiters in the IT recruitment Poland process

By using the services of the Sowelo Consulting agency, many of the problems that the recruitment process brings can be avoided. As statistics show, as many as 89% of recruitment failures are due to the poor attitudes of new hires. Experienced recruiters who can read candidates' subtle signals and body language can therefore be an invaluable aid in the recruitment process. By spotting arrogant or disrespectful behaviour and negative comments about former employers, a Sowelo recruiter can avoid hiring the wrong person.

After all, it is well known that an inadequate employee is not only a waste of money but also a risk: confidential business information may be passed to competitors, the company's image may suffer, and valuable time is wasted. It is therefore a good idea to use the services of an agency that has experience in its field and is aware of the many ways recruitment can fail.

Sowelo headhunters are highly effective and achieve great results. Their in-depth knowledge gained over the years, their willingness to help and their commitment to each project are what clients really appreciate, and what brings them back with new projects. So whenever you need professional recruitment services for your company, feel free to contact the Sowelo team to hire developers and other qualified candidates for your team.




Next-Gen Hardware Trends: A Glimpse into Future Possibilities



The race to introduce the newest hardware generation is an ongoing effort in the ever-changing world of technology. Hardware is about to enter a transformative era that promises to change how we interact with our devices, from faster CPUs to more immersive screens. As we gaze into the innovation crystal ball, several enticing trends show up, providing a glimpse into the fascinating possibilities.

Working with computer hardware – illustrative photo. Image credit: Jeshoots via Unsplash, free license

Quantum computing

The limits of conventional computing are broken when you enter the world of quantum computing. Quantum hardware has the potential to transform fields like drug discovery and encryption by performing intricate calculations at rates that were previously unthinkable. The time is approaching when quantum computers will be more widely available, promising breakthroughs that could reshape entire industries.

Processors Integrated with AI

Future hardware should have intelligence as well as brute power. AI-integrated CPUs are opening the door for gadgets that can quickly learn, adapt, and change. Imagine a smartphone that learns from your usage patterns or a self-driving vehicle that becomes better at what it does with each trip. The boundaries between human and machine capabilities will melt as AI and hardware become increasingly entwined, offering us a world of possibilities that we have only just begun to imagine.

Extended Reality

The field of extended reality, where the physical and digital worlds smoothly converge, is likewise driven by hardware trends. Both augmented reality (AR) and virtual reality (VR) are gaining popularity because they provide immersive experiences for applications in entertainment, learning, and the workplace. In addition to tremendous performance, the next-generation gear needed to power these experiences must also provide an unmatched level of realism.

Edge Computing

With the rise of edge computing, data processing is moving away from being centralized and toward the edge of networks, where it is most needed. Real-time applications, from IoT devices to autonomous systems, could benefit from this trend. Edge computing’s hardware breakthroughs will reshape how we process, analyze, and act on data, resulting in faster response times and lower latency.

Sustainable Technology

The next-generation hardware market also emphasizes sustainability as people grow more aware of their environmental impact. Future electronics will be made with the environment in mind, from recyclable materials to energy-efficient processors. This trend demonstrates how innovation may lead to good change in addition to being in line with the goals of global sustainability.

Beyond 5G

The deployment of 5G networks has ushered in a new era of connectivity, but the hardware trends continue. The development of 6G and beyond offers the potential for significantly faster data speeds, incredibly low latency, and the simultaneous connection of a huge number of devices. This progression will make innovations like real-time remote surgery, sophisticated smart cities, and seamless international connectivity possible.

Biometric Hardware

Biometric technologies, which use our distinctive physical characteristics as the key to unlock gadgets and experiences, are also a part of the hardware future. Unprecedented levels of protection and customization can be achieved using facial recognition, fingerprint sensors, and pulse patterns. These technologies will change how we access information and engage with the digital world as they develop.

Neuromorphic Computing

Inspired by the brain's structure, neuromorphic computing aspires to develop hardware that resembles the human brain's neural networks. With the ability to process information more like a person, this technology could revolutionize artificial intelligence. The potential for pattern recognition, cognitive computing, and problem-solving is endless.

Holographic Displays

Holographic displays are the upcoming hardware trend, so say goodbye to flat screens. These screens provide real 3-D images that may be seen from all angles without special glasses. Holographic displays are poised to revolutionize our engagement with digital material, from immersive gaming experiences to lifelike virtual shopping.

Wearable Ecosystems

Beyond smartwatches and fitness trackers, wearable technology is developing. The following generation of wearables will build connected ecosystems that meld into our daily lives. We can stay connected, check our health, and interact with the digital world in ways previously only possible in science fiction thanks to smart clothing, augmented reality glasses, and implantable gadgets.

Hyper-Connected Homes

With more devices and sensors added to our living areas, the smart home revolution is expected to pick up speed. The next-generation hardware in our homes will create an environment that adjusts to our preferences and needs, making daily life more practical and effective. Examples include AI-powered kitchen equipment and self-learning thermostats.

Conclusion

The only thing constant in the world of technology is change, and the hardware trends just around the corner herald an innovative and exciting future.

The convergence of quantum computing, AI integration, extended reality, edge computing, and sustainable technology promises to reshape our world profoundly. Because each trend converges with and influences the others, the resulting landscape will be one where technology is not just a tool but an integral part of our lives, enhancing our capabilities, expanding our horizons, and shaping a more connected, intelligent, and exciting world than we ever imagined.




Solar Orbiter closes in on the solution to a 65-year-old solar mystery



The Sun’s atmosphere is called the corona. It consists of an electrically charged gas known as plasma with a temperature of around one million degrees Celsius.

Its temperature is an enduring mystery because the Sun’s surface is only around 6000 degrees. The corona should be cooler than the surface because the Sun’s energy comes from the nuclear furnace in its core, and things naturally get cooler the further away they are from a heat source. Yet the corona is more than 150 times hotter than the surface.

Another method for transferring energy into the plasma must be at work, but what?

It has long been suspected that turbulence in the solar atmosphere could result in significant heating of the plasma in the corona. But when it comes to investigating this phenomenon, solar physicists run into a practical problem: it is impossible to gather all the data they need with just one spacecraft.

There are two ways to investigate the Sun: remote sensing and in-situ measurements. In remote sensing, the spacecraft is positioned far away and uses cameras to look at the Sun and its atmosphere in different wavelengths. For in-situ measurements, the spacecraft flies through the region it wants to investigate and takes measurements of the particles and magnetic fields in that part of space.

Both approaches have their advantages. Remote sensing shows the large-scale results but not the details of the processes happening in the plasma. Meanwhile, in-situ measurements give highly specific information about the small-scale processes in the plasma but do not show how this affects the large scale.

To get the full picture, two spacecraft are needed. This is exactly what solar physicists currently have in the form of the ESA-led Solar Orbiter spacecraft, and NASA’s Parker Solar Probe. Solar Orbiter is designed to get as close to the Sun as it can and still perform remote sensing operations, along with in-situ measurements. Parker Solar Probe largely forgoes remote sensing of the Sun itself to get even closer for its in-situ measurements.

But to take full advantage of their complementary approaches, Parker Solar Probe would have to be within the field of view of one of Solar Orbiter’s instruments. That way Solar Orbiter could record the large-scale consequences of what Parker Solar Probe was measuring in situ.

Daniele Telloni, researcher at the Italian National Institute for Astrophysics (INAF) at the Astrophysical Observatory of Torino, is part of the team behind Solar Orbiter’s Metis instrument. Metis is a coronagraph that blocks out the light from the Sun’s surface and takes pictures of the corona. It is the perfect instrument to use for the large-scale measurements and so Daniele began looking for times when Parker Solar Probe would line up.

He found that on 1 June 2022, the two spacecraft would almost be in the correct orbital configuration. Essentially, Solar Orbiter would be looking at the Sun and Parker Solar Probe would be just off to the side, tantalisingly close but just out of the field of view of the Metis instrument.

As Daniele looked at the problem, he realised all it would take to bring Parker Solar Probe into view was a little bit of gymnastics with Solar Orbiter: a 45 degree roll and then pointing it slightly away from the Sun.

But when every manoeuvre of a space mission is carefully planned in advance, and spacecraft are themselves designed to point only in very specific directions, especially when coping with the fearsome heat of the Sun, it was not clear that the spacecraft operations team would authorise such a deviation. However, once everyone was clear on the potential scientific return, the decision was a clear ‘yes’.

The roll and the offset pointing went ahead; Parker Solar Probe came into the field of view, and together the spacecraft produced the first ever simultaneous measurements of the large scale configuration of the solar corona and the microphysical properties of the plasma.

Artist's impression of Solar Orbiter and Parker Solar Probe

“This work is the result of contributions from many, many people,” says Daniele, who led the analysis of the data sets. They made the first combined observational and in-situ estimate of the coronal heating rate.

“The ability to use both Solar Orbiter and Parker Solar Probe has really opened up an entirely new dimension in this research,” says Gary Zank, University of Alabama in Huntsville, USA, and a co-author on the resulting paper.

By comparing the newly measured rate to the theoretical predictions made by solar physicists over the years, Daniele has shown that solar physicists were almost certainly right in identifying turbulence as a way of transferring energy.

The specific way turbulence does this is not dissimilar to when you stir your morning cup of coffee. By stimulating random movements of a fluid, either a gas or a liquid, energy is transferred to ever smaller scales, which culminates in energy transformation into heat. In the case of the solar corona, the fluid is also magnetized, so stored magnetic energy is also available to be converted into heat.

Such a transfer of magnetic and movement energy from larger to smaller scales is the very essence of turbulence. At the smallest scales, it allows the fluctuations to finally interact with individual particles, mostly protons, and heat them up.
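As a rough, textbook-style sketch (this is the standard turbulence estimate, not the formula used in the new study), the rate at which turbulence drains energy from large scales, and hence heats the plasma, can be written in terms of the typical fluctuation speed and the size of the largest eddies:

```latex
% Illustrative order-of-magnitude estimate of the turbulent heating rate
% per unit mass, where u is the large-scale fluctuation speed and \ell is
% the size of the largest eddies (standard hydrodynamics, assumed here):
\varepsilon \sim \frac{u^{3}}{\ell}
% In a magnetized plasma such as the corona, magnetic fluctuations \delta B
% contribute on a similar footing (via the Alfven speed), so both kinetic
% and magnetic energy are available to cascade down and heat the protons.
```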

More work is needed before we can say that the solar heating problem is solved, but now, thanks to Daniele's work, solar physicists have their first measurement of this process.

“This is a scientific first. This work represents a significant step forward in solving the coronal heating problem,” says Daniel Müller, ESA Project Scientist for Solar Orbiter.

Source: European Space Agency




Graphene discovery could help generate hydrogen cheaply and sustainably



Researchers from The University of Warwick and the University of Manchester have finally solved the long-standing puzzle of why graphene is so much more permeable to protons than expected by theory.

Graphene – illustrative photo. Image credit: Pixabay (Free Pixabay license)

A decade ago, scientists at The University of Manchester demonstrated that graphene is permeable to protons, nuclei of hydrogen atoms.

The unexpected result started a debate in the community because theory predicted that it would take billions of years for a proton to permeate through graphene’s dense crystalline structure. This had led to suggestions that protons permeate not through the crystal lattice itself, but through the pinholes in its structure.

Now, writing in Nature, a collaboration between the University of Warwick, led by Prof. Patrick Unwin, and The University of Manchester, led by Dr. Marcelo Lozada-Hidalgo and Prof. Andre Geim, reports ultra-high spatial resolution measurements of proton transport through graphene and proves that perfect graphene crystals are permeable to protons. Unexpectedly, protons are strongly accelerated around nanoscale wrinkles and ripples in the crystal.

The discovery has the potential to accelerate the hydrogen economy. Expensive catalysts and membranes, sometimes with significant environmental footprint, currently used to generate and utilise hydrogen could be replaced with more sustainable 2D crystals, reducing carbon emissions, and contributing to Net Zero through the generation of green hydrogen.

The team used a technique known as scanning electrochemical cell microscopy (SECCM) to measure minute proton currents collected from nanometre-sized areas. This allowed the researchers to visualise the spatial distribution of proton currents through graphene membranes.

If proton transport took place through holes as some scientists speculated, the currents would be concentrated in a few isolated spots. No such isolated spots were found, which ruled out the presence of holes in the graphene membranes.

Drs. Segun Wahab and Enrico Daviddi, lead authors of the paper, commented: “We were surprised to see absolutely no defects in the graphene crystals. Our results provide microscopic proof that graphene is intrinsically permeable to protons.”

Unexpectedly, the proton currents were found to be accelerated around nanometre-sized wrinkles in the crystals. The scientists found that this arises because the wrinkles effectively ‘stretch’ the graphene lattice, thus providing a larger space for protons to permeate through the pristine crystal lattice. This observation now reconciles the experiment and theory.

Dr. Lozada-Hidalgo said: “We are effectively stretching an atomic scale mesh and observing a higher current through the stretched interatomic spaces in this mesh – this is truly mind-boggling.”

Prof. Unwin commented: “These results showcase SECCM, developed in our lab, as a powerful technique to obtain microscopic insights into electrochemical interfaces, which opens up exciting possibilities for the design of next-generation membranes and separators involving protons.”

The authors are excited about the potential of this discovery to enable new hydrogen-based technologies. Dr. Lozada-Hidalgo said, “Exploiting the catalytic activity of ripples and wrinkles in 2D crystals is a fundamentally new way to accelerate ion transport and chemical reactions. This could lead to the development of low-cost catalysts for hydrogen-related technologies.”

Read the full paper here: https://www.nature.com/articles/s41586-023-06247-6

Source: University of Warwick




AI Chatting: Free AI Chatbot at Your Service



In the past few years, artificial intelligence (AI) has taken great strides in the realm of online communications. Searches for phrases like “chat AI ask anything” keep growing. Partly for this reason, chatbots have emerged as the digital companions of the future, designed to streamline and simplify our online experiences.

Among these, AI Chatting stands out as a free AI chatbot that promises to revolutionize the way we interact with digital services. This AI-powered chatbot is designed to assist you with any questions, problems, or tasks you may have. From general information and recommendations to casual conversation, AI Chatting is at your service, providing the most accurate and relevant responses.

Without further ado, let’s get into the world of AI Chatting! It’s up to you to judge whether it truly lives up to the lofty expectations it sets.

Introduction to AI Chatting

Among the many AI chatbots on the market, AI Chatting can be counted as one of the earliest to launch. It was first launched in 2020, built on OpenAI technology, and claims to be constantly improved and updated. According to its own replies, it was last updated in 2021 with the GPT-3 architecture.

AI Chatting is built from code and artificial intelligence. Fundamental to it is its ability to learn and improve with use: it relies on machine learning to continually enhance its understanding and responses. For instance, if you ask a question that AI Chatting cannot answer, it will take note and work to improve its ability to respond to similar questions in the future.

One of the first things you’ll notice when using AI Chatting is its user-friendliness. Whether you’re tech-savvy or a newcomer to AI character chat, its intuitive interface ensures that anyone can interact with it effortlessly.


The Power of AI Chatting

For sure, this free AI chatbot brings a multitude of advantages to the table, making it a valuable tool for users seeking assistance in the digital realm. Some key advantages of using it include:

●      Versatility

Whether you are looking for information, insights, guidance, text translation, inspiration, or simply entertainment, AI Chatting can provide prompt responses to your queries, saving you precious time. It can compile information, analyze your inputs and deliver an appropriate answer in seconds.

●      Communication Channels

AI Chatting also supports communication in various ways, one of which is customer support portals, where it engages with visitors and provides instant assistance. It can handle initial inquiries, direct users to appropriate resources and then escalate to a human agent if required.

Apart from that, a document can be translated in seconds. This AI chatbot is trained in dozens of languages, such as English, Mandarin, Korean, Italian and German. It breaks down language barriers and lets you communicate freely with people from any part of the world.

●      Privacy and Security

In an age of growing digital concerns, AI Chatting is committed to taking user privacy seriously. Besides ensuring that generated content is appropriate and abides by community guidelines, it encrypts all conversations to safeguard your data and does not store your chats. Respect for users' privacy and confidentiality is its number one priority.

●      Cost Saving

Hiring a professional or training a team of human agents not only costs a considerable amount of money but also cannot guarantee results. AI chatbots are developed and deployed to match your requirements at minimal cost, yet they can handle large volumes of inquiries without added expense. You invest once at the beginning, and the chatbot keeps serving you.

Summary

In conclusion, AI Chatting represents a step forward in the field of AI chatbots and can be a valuable resource for those seeking online assistance. This AI chatbot provides up to 20 free credits per day, offering a wide range of useful functionalities, and is easy to use. Furthermore, its machine learning capability makes it increasingly effective with continuous use, and its attention to user privacy is commendable.

So, if you’re looking for a free AI chatbot that’s easy to use and offers a broad range of features, AI Chatting might be the right choice for you!




The big business of mental illness


Psychologist Lisa Cosgrove, a professor at the University of Massachusetts, has explained that more than 5% of young schoolchildren take psychotropic drugs daily. And although this statement was based on a study of medical drug consumption in the United States, it can be extrapolated to any country where psychiatry and the pharmaceutical industry have not stopped generating new mental illnesses.

In the United States in 1980, 30 million boxes of antidepressants were prescribed; by 2012 the figure had reached 264 million prescriptions. What was the reason for this surge? What has happened from 2012 to today? Perhaps the answer is as simple as it is dangerous: mental illness has become a business that generates billions of dollars in profits.

In 2014 a book was published that I have mentioned in previous reports, but it now acquires special relevance because similar exposés are currently in preparation at various publishers. The book is Are We All Mentally Ill? by Allen Frances, distinguished professor emeritus of Psychiatry and Behavioral Sciences at Duke University in Durham, North Carolina. Why is this book especially relevant? Simply because its author chaired the DSM-IV task force and was part of the DSM-III leadership team.

Years later he himself confessed, having participated in those projects, that after the publication of the DSM-5 in May 2013 there is almost no human behavior that cannot at a given moment be classified as a “mental disorder” and, therefore, be deemed susceptible to “solving” with drugs whose intake entails numerous side effects.

Behind the name DSM hides the misnamed Diagnostic and Statistical Manual of Mental Disorders. This manual has been discredited ad nauseam by doctors and psychiatrists around the world, among them the aforementioned Allen Frances, who actively participated in several of its editions. Soon, in the style of Empire of Pain by the American journalist Patrick Radden Keefe, another journalist, Robert Whitaker, together with the psychologist Lisa Cosgrove, will see their book Psychiatry Under the Influence translated into Spanish, and very possibly into other languages around the world, despite various attempts to silence its publication. In it they tell the story of how an allegedly corrupt alliance catalogued mental illnesses and triggered the massive use of psychotropic drugs around the world. The person reporting this is Daniel Arjona, a journalist at the newspaper El Mundo, who on Friday, September 1, 2023, published, among other things, two important points.

The first is the interesting statement that Dr. Cosgrove sent him by email, where she put her finger on an indisputable point: (…) Over the past 35 years, psychiatry has transformed American culture. It has changed our view of childhood and what is expected of “normal” children, to the point that more than 5% of school-age young people now take a psychotropic drug daily. “It has changed our behavior as adults and, in particular, the way we seek to cope with emotional distress and difficulties in our lives.” And that is why millions of people around the world have fallen into the hands of psychotropic drugs with psychiatric endorsement. A real imprudence; sheer nonsense.

The second question that Whitaker and Cosgrove try to answer in their book, as reflected in Arjona's article, is the following: (…) What is the thesis of this amendment to the whole? Since the publication in 1980 of the third and decisive version of the DSM (today there are five, all of them under discussion), psychiatry has succumbed to institutional corruption on two fronts: that of the big pharmaceutical companies and that of the “guild influences” represented by an American Psychiatric Association voracious in defending and expanding its business.

Having said that, I encourage you to read some of the articles published under my byline on antidepressants and the illegal commission business in China, for example, where you can get an idea of the magnitude of the tragedy facing humanity.

Is the DSM to blame? Categorically not. The blame lies with a system that allows large pharmaceutical companies to easily advertise “happiness” pills for all kinds of problems. Something similar happened with ADHD (Attention Deficit Hyperactivity Disorder). In the 1990s, this “disease” barely occupied a small corner of the enormous pharmaceutical industry's profits; the income it generated barely reached 70 million dollars. But some years later, when the DSM-IV was published, an enormous business opportunity was spotted. Psychiatrists had opened a door with their diagnostic assumptions, patents were created, and a huge advertising campaign was aimed at patients (the general public) and doctors.

Everyone saw the sky open when it was accepted that, with a pill, “hyperactive” children would calm down, and teachers and families would finally have moments of respite. Society “bought” that benefit and, under the slogan “Consult your doctor”, the market tripled in just a few years and keeps growing, as society at large has accepted that it is acceptable to medicate children from an early age. It has been accepted that many university students talk about mental health and take medication, and also, by teachers, parents and doctors, that a quiet classroom benefits the emotional health of children.

In some countries, the consumption of products of this type, antidepressants and anxiolytics, is creating increasingly sick societies, where access to these drugs is much easier than it may seem. That is why lists of the countries with the highest consumption of such products are periodically compiled; among them, without needing to give percentages, we can highlight the following ten: the United States, Iceland, Australia, Portugal, the United Kingdom, Canada, Sweden, Belgium, Denmark and Spain. As a fact worth noting for its proximity, a 2022 report from Spain ran under the headline: The data after a decade of “medicine culture” in Spain: the consumption of antidepressants has grown by 40%. It gave two keys to this increase: the improvement of several drugs, combined with industry strategies and their use as a resource to end a consultation quickly.

Could the prescription of antidepressants or anxiolytics have become an absurd excuse to get rid of patients in a medical consultation? I imagine we will have to look for an answer in the future, although I fear what we are going to find.

Perhaps, as a preview of future research, I will stick with one of the answers that Allen Frances gave in one of his many interviews to the question:

-Isn’t the increase in the number of alleged “mental illnesses” then due to both psychiatrists and the pharmaceutical industry?

-Certainly. Look, pharmaceutical multinationals, especially those grouped under the expression Big Pharma, have become dangerous, and not only in the field of psychiatry. In the United States, for example, there are now more deaths each year from drug overdoses than from traffic accidents, most of them caused by prescription narcotics, not illegal drugs. Of course, pharmaceutical multinationals are experts at inventing diseases to sell drugs; in fact, they invest billions of dollars in spreading misleading messages.

As I finished transcribing Allen's response, a dystopia came to mind in which I imagined drug cartels advertising their products in media of every kind, without any control and with the approval of many members of a dystopian society (authorities, media, teachers, parents and so on) who profited, emotionally or financially, from the widespread consumption of those products.

Information sources:
Graphic: Which countries consume the most antidepressants? | Statista
Medication data: consumption of antidepressants grows by 40% (rtve.es)
DSALUD (magazine), no. 177, December 2014
El Mundo newspaper, Friday, September 1, 2023
Book: Are We All Mentally Ill? by Allen Frances. Ariel Editorial, 2014

Originally published at LaDamadeElche.com

On the Road to Spotting Alien Life



The focal plane mask for the Coronagraph Instrument on NASA’s Nancy Grace Roman Space Telescope. Each circular section contains multiple “masks” – carefully engineered, opaque obstructions designed to block starlight. Image credit: NASA/JPL-Caltech

In early August, scientists and engineers gathered in a small auditorium at Caltech to discuss how to build the first space telescope capable of detecting alien life on planets like Earth.

The proposed mission concept, the Habitable Worlds Observatory (HWO), would be the next powerful astrophysics observatory after NASA's James Webb Space Telescope (JWST). It would be able to study stars, galaxies, and a host of other cosmic objects, including planets outside our solar system, known as exoplanets, and potentially even alien life.

Though finding alien life on exoplanets may be a long shot, the Caltech workshop aimed to assess the state of technology HWO needs to search for life elsewhere.

“Before we can design the mission, we need to develop the key technologies as much as possible,” says Dimitri Mawet, a member of the Technical Assessment Group (TAG) for HWO, the David Morrisroe Professor of Astronomy, and a senior research scientist at the Jet Propulsion Laboratory (JPL), which Caltech manages for NASA.

“We are in a phase of technology maturation. The idea is to further advance the technologies that will enable the Habitable Worlds Observatory to deliver its revolutionary science while minimizing the risks of cost overruns down the line.”

First proposed as part of the National Academy of Sciences' Decadal Survey on Astronomy and Astrophysics 2020 (Astro2020), a 10-year roadmap that outlines goals for the astronomy community, HWO would launch in the late 2030s or early 2040s. The mission's observing time would be divided between general astrophysics and exoplanet studies.


Sara Seager of MIT gave a talk at the Caltech workshop titled “Towards Starlight Suppression for the Habitable Worlds Observatory.” Image credit: Caltech

“The Decadal Survey recommended this mission as its top priority because of the transformational capabilities it would have for astrophysics, together with its ability to understand entire solar systems outside of our own,” says Fiona Harrison, one of two chairs of the Astro2020 decadal report and the Harold A. Rosen Professor of Physics at Caltech, as well as the Kent and Joyce Kresa Leadership Chair of the Division of Physics, Mathematics and Astronomy.

The space telescope’s ability to characterize the atmospheres of exoplanets, and therefore look for signatures that could indicate alien life, depends on technologies that block the glare from a distant star.

There are two main ways of blocking the star’s light: a small mask internal to the telescope, known as a coronagraph, and a large mask external to the telescope, known as a starshade. In space, starshades would unfurl into a giant sunflower-shaped structure, as seen in this animation.


Artist’s concept of an Earth-like planet in the habitable zone of its star. New observatory will search for alien life. Image credit: NASA Ames/JPL-Caltech/T. Pyle

In both cases, the light of stars is blocked so that faint starlight reflecting off a nearby planet is revealed. The process is similar to holding your hand up to block the sun while snapping a picture of your smiling friends.

By directly capturing the light of a planet, researchers can then use other instruments, called spectrometers, to scrutinize that light in search of chemical signatures. If any life is present on a planet orbiting a distant star, then the collective inhales and exhales of that life might be detectable in the form of biosignatures.

“We estimate there are as many as several billion Earth-size planets in the habitable zone in our galaxy alone,” says Nick Siegler, the chief technologist of NASA’s Exoplanet Exploration Program at JPL. The habitable zone is the region around a star where temperatures are suitable for liquid water.

“We want to probe the atmospheres of these exoplanets to look for oxygen, methane, water vapor, and other chemicals that could signal the presence of life. We aren’t going to see little green [alien] men but rather spectral signatures of these key chemicals, or what we call biosignatures.”

According to Siegler, NASA has decided to focus on the coronagraph route for the HWO concept, building on recent investments in NASA’s Nancy Grace Roman Space Telescope, which will utilize an advanced coronagraph for imaging gas-giant exoplanets. (Caltech’s IPAC is home to the Roman Science Support Center).

Today, coronagraphs are in use on several other telescopes, including the orbiting JWST, Hubble, and ground-based observatories.

Mawet has developed coronagraphs for use in instruments at the W. M. Keck Observatory atop Maunakea, a mountain on the Big Island of Hawai’i.

The most recent version, known as a vortex coronagraph, was invented by Mawet and resides inside the Keck Planet Imager and Characterizer (KPIC), an instrument that allows researchers to directly image and study the thermal emissions of young and warm gas-giant exoplanets.

The coronagraph cancels out a star’s light to the point where the instrument can take pictures of planets that are about a million times fainter than their stars. That allows researchers to characterize the atmospheres, orbits, and spins of young gas-giant exoplanets in detail, helping to answer questions about the formation and evolution of other solar systems.

But directly imaging a twin Earth planet—where life as we know it is most likely to flourish—will take a massive refinement of current technologies. Planets like Earth that orbit sun-like stars in the habitable zone are easily lost in the glare of their stars.

Our own sun, for example, outshines the light of Earth by 10 billion times. For a coronagraph to achieve this level of starlight suppression, researchers will have to push their technologies to the limit.

“As we get closer and closer to this required level of starlight suppression, the challenges become exponentially harder,” Mawet says.

The Caltech workshop participants discussed a coronagraph technique that involves controlling light waves with an ultraprecise deformable mirror inside the instrument.

While coronagraphs can block out much of a star’s light, stray light can still make its way into the final image, appearing as speckles. By using thousands of actuators that push and pull on the reflective surface of the deformable mirror, researchers can cancel the blobs of residual starlight.
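As a toy illustration of why mirror precision matters (illustrative numbers only, not Roman's actual control loop): for small wavefront errors, the fraction of starlight scattered into speckles scales roughly with the variance of the residual phase, so shrinking the deformable mirror's residual error by orders of magnitude shrinks the speckle floor dramatically. A minimal sketch in Python:

```python
import numpy as np

# Toy model of active speckle suppression (illustrative, not Roman's design).
# For small aberrations, the fraction of starlight scattered into speckles
# is roughly the variance of the residual wavefront phase (in radians^2).
rng = np.random.default_rng(0)
n = 4096                      # wavefront samples across the pupil
wavelength = 550e-9           # observing wavelength in metres (assumed)

def speckle_level(wavefront_error_m):
    """Approximate relative speckle intensity from a wavefront error map."""
    phase = 2 * np.pi * wavefront_error_m / wavelength
    return np.var(phase)

# Uncorrected wavefront error: ~20 nm RMS (assumed)
before = speckle_level(20e-9 * rng.standard_normal(n))

# After the deformable mirror subtracts its estimate of the error,
# leaving a ~100 pm RMS residual (assumed picometre-level control)
after = speckle_level(100e-12 * rng.standard_normal(n))

print(f"speckle level before DM: {before:.1e}")   # ~5e-2 of the starlight
print(f"speckle level after DM:  {after:.1e}")    # ~1e-6 of the starlight
print(f"suppression factor:      {before / after:.0f}x")
```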

The upcoming Nancy Grace Roman Space Telescope will be the first to utilize this type of coronagraph, which is referred to as “active” because its mirror will be actively deformed. After more tests at JPL, the Roman coronagraph will ultimately be integrated into the final telescope at NASA’s Goddard Space Flight Center and launched into space no later than 2027.

The Roman Coronagraph Instrument will enable astronomers to image exoplanets possibly up to a billion times fainter than their stars. This includes both mature and young gas giants as well as disks of debris left over from the planet-formation process.

“The Roman Coronagraph Instrument is NASA’s next step along the path to finding life outside our solar system,” says Vanessa Bailey, the instrument technologist for Roman’s coronagraph at JPL.

“The performance gap between today’s telescopes and the Habitable Worlds Observatory is too large to bridge all at once. The purpose of the Roman Coronagraph Instrument is to be that intermediate steppingstone. It will demonstrate several of the necessary technologies, including coronagraph masks and deformable mirrors, at levels of performance never before achieved outside the lab.”

The quest to directly image an Earth twin around a sun-like star will mean pushing the technology behind Roman’s coronagraph even further.

“We need to be able to deform the mirrors to a picometer-level of precision,” Mawet explains.

“We will need to suppress the starlight by another factor of roughly 100 compared to Roman’s coronagraph. The workshop helped guide us in figuring out where the gaps are in our technology, and where we need to do more development in the coming decade.”
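Pulling together the numbers quoted in this article (rough bookkeeping, not mission requirements; Roman's figure is taken from the "up to a billion times fainter" estimate above):

```python
# Rough contrast bookkeeping from figures quoted in this article.
earth_sun_contrast = 1 / 10e9        # Earth is ~10 billion times fainter than the Sun
roman_contrast = 1 / 1e9             # Roman: planets "up to a billion times fainter"
hwo_contrast = roman_contrast / 100  # "another factor of roughly 100" beyond Roman

print(f"Earth/Sun flux ratio: {earth_sun_contrast:.0e}")  # 1e-10
print(f"Roman coronagraph:    {roman_contrast:.0e}")      # 1e-09
print(f"HWO target:           {hwo_contrast:.0e}")        # 1e-11
print(hwo_contrast < earth_sun_contrast)  # True: deep enough for an Earth twin
```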

Other topics of conversation at the workshop included the best kind of primary mirror for use with the coronagraph, mirror coatings, dealing with damage to the mirrors from micrometeoroids, deformable mirror technologies, as well as detectors and advanced tools for integrated modeling and design.

Engineers also provided a status update on the starshade and its technological readiness.

Meanwhile, as technology drives ahead, other scientists have their eyes on the stars in search of Earth-like planets and possibly alien life that the HWO would image.

More than 5,500 exoplanets have been discovered so far, but none of them are truly Earth-like. Planet-hunting tools, such as the new Caltech-led Keck Planet Finder (KPF) at the Keck Observatory, have become better equipped to find planets by looking for the tugs they exert on their stars as they orbit around.

Heavier planets exert more of a tug, as do planets that orbit closer to their stars. KPF was designed to find Earth-size planets in the habitable zones of small red stars (the habitable zones for red stars are closer in). With additional refinements over the next several years, KPF may be able to detect Earth twins.

By the time HWO would launch in the late 2030s or early 2040s, scientists hope to have a catalog of at least 25 Earth-like planets to explore.

Despite the long road ahead, the scientists at the workshop eagerly discussed these challenges with their colleagues who had traveled to Pasadena from around the country. JPL director Laurie Leshin (MS ’89, PhD ’95) gave a pep talk at the start of the meeting.

“It’s an exciting and daunting challenge,” she said. “But that’s what we all live for. We don’t do it alone. We do it in collaboration.”

Written by Whitney Clavin

Source: Caltech




Farm Dams Can Be Converted Into Renewable Energy Storage Systems



New research suggests Australia’s agricultural water reservoirs could be an innovative energy storage solution for variable renewables.

Over 30,000 micro-pumped hydro energy storage systems could potentially be made leveraging existing agricultural dams. Image credit: Pixabay, free license

Tens of thousands of small-scale hydroenergy storage sites could be built from Australia’s farm dams, supporting the uptake of reliable, low-carbon power systems in rural communities, new UNSW-Sydney-led research suggests.

The study, published in Applied Energy, finds agricultural reservoirs, like those used for solar-power irrigation, could be connected to form micro-pumped hydroenergy storage systems – household-size versions of the Snowy Hydro hydroelectric dam project. It’s the first study in the world to assess the potential of these small-scale systems as an innovative renewable energy storage solution.


Farm irrigation system. Image credit: deraugustodesign via Pixabay, CC0 Public Domain

With the increasing shift towards variable energy sources like wind and solar photovoltaics, storing surplus energy is essential for ensuring a stable and reliable power supply. In other words, when the sun isn’t up or the wind isn’t blowing, stored energy can help balance energy supply and demand in real time and overcome the risk of shortages and overloads. 

In a micro-pumped hydro energy storage system, excess solar energy from high-production periods is stored by pumping water to a high-lying reservoir, which is released back to a low-lying reservoir when more power is needed, flowing through a turbine-connected generator to create electricity.
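The storable energy of such a pair follows from the basic hydropower relation E = ρ·g·V·h·η. Here is a back-of-envelope sketch with assumed (illustrative) reservoir numbers, chosen to land near the 30 kWh average site reported later in this article:

```python
# Back-of-envelope energy for a micro-pumped hydro pair: E = rho*g*V*h*eta.
# Volume, head and efficiency below are assumptions for illustration only.
rho = 1000.0    # water density, kg/m^3
g = 9.81        # gravitational acceleration, m/s^2
V = 250.0       # usable water volume, m^3 (assumed)
h = 60.0        # elevation difference (head) between the dams, m (assumed)
eta = 0.75      # generation-side efficiency (assumed)

energy_kwh = rho * g * V * h * eta / 3.6e6   # joules -> kWh
print(f"storable energy: {energy_kwh:.0f} kWh")       # ~31 kWh

# At a ~0.75 kW average household load, that is roughly 40 hours of backup:
print(f"backup time: {energy_kwh / 0.75:.0f} h")      # ~41 h
```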

However, constructing new water reservoirs for micro-pumped hydro energy storage can be expensive. 

“The transition to low-carbon power systems like wind and solar photovoltaics needs cost-effective energy storage solutions at all scales,” says Dr Nicholas Gilmore, lead author of the study and lecturer at the School of Mechanical and Manufacturing Engineering at UNSW Engineering.

“We thought – if you’re geographically fortunate to have two significant water volumes separated with sufficient elevation, you might have the potential to have your own hydro energy storage system.”


Micro-pumped hydro energy storage systems store excess solar energy from high-production periods by pumping water to a high-lying reservoir, which is released back to a low-lying reservoir when more power is needed. Image credit: UNSW

Unlocking the untapped potential of farm dams

For the study, the team, which also included researchers from Deakin University and the University of Technology Sydney, used satellite imagery to create unique agricultural reservoir pairings across Australia from a 2021 dataset of farm dams.

They then used graph theory algorithms – a branch of mathematics that models how nodes can be organised and interconnected – to filter commercially promising sites based on minimum capacity and slope. 

“If you have a lot of dams in close proximity, it's not viable to link them up in every combination,” says Dr Thomas Britz, co-author of the study and senior lecturer at UNSW Science's School of Mathematics and Statistics. “So, we use these graph theory algorithms to connect the best dam configurations with a reasonable energy capacity.”
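A minimal sketch of the screening idea (the dam data and thresholds below are invented, and the study's actual graph algorithms and criteria are more sophisticated):

```python
import math

# Toy screening of dam pairs: keep pairs that are close enough, have enough
# head (elevation difference), and clear a minimum storable energy.
dams = [
    # (name, x_m, y_m, elevation_m, usable_volume_m3) -- invented data
    ("A", 0, 0, 420, 300),
    ("B", 900, 200, 355, 500),
    ("C", 250, 700, 410, 250),
    ("D", 1500, 900, 340, 800),
]

RHO, G, ETA = 1000.0, 9.81, 0.75                       # density, gravity, efficiency
MIN_HEAD_M, MAX_DIST_M, MIN_KWH = 30.0, 1200.0, 10.0   # assumed thresholds

pairs = []
for i, (na, xa, ya, za, va) in enumerate(dams):
    for nb, xb, yb, zb, vb in dams[i + 1:]:
        head = abs(za - zb)
        dist = math.hypot(xa - xb, ya - yb)
        if head < MIN_HEAD_M or dist > MAX_DIST_M:
            continue
        volume = min(va, vb)                # limited by the smaller reservoir
        kwh = RHO * G * volume * head * ETA / 3.6e6
        if kwh >= MIN_KWH:
            pairs.append((na, nb, round(kwh, 1)))

# Best candidate pairs first, e.g. [('A', 'B', 39.9), ('B', 'C', 28.1)]
print(sorted(pairs, key=lambda p: -p[2]))
```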

From nearly 1.7 million farm dams, the researchers identified over 30,000 sites across Australia as promising for micro-pumped hydro energy storage. The average site could provide up to 2 kW of power and 30 kWh of usable energy – enough to back up a South Australian home for 40 hours.

“We identified tens of thousands of these potential sites where micro-pumped hydro energy storage systems could be installed without undertaking costly reservoir construction,” Dr Gilmore says. “That’s thousands of households that could potentially increase their solar usage, saving money on their energy bills, and reducing their carbon footprint.”

The research team also benchmarked a micro-pumped hydro site to a commercially available lithium-ion battery in solar-powered irrigation systems. Despite a low discharge efficiency, they found the pumped hydro storage was 30 per cent cheaper for a large single cycle load due to its high storage capacity.

“While the initial outlay for a micro-pumped hydro energy storage system is higher than a battery, the advantages are larger storage capacity and potential durability for decades,” Dr Gilmore says. “But that cost is significantly reduced anyway by capitalising on existing reservoirs, which also has the added benefit of less environmental impact.”

Building micro-pumped hydro energy power systems from existing farm dams could also assist rural areas susceptible to power outages that need a secure and reliable backup power source. Battery backup power is generally limited to less than half a day, while generators, though powerful, are dependent on affordable fuel supply and produce harmful emissions.

“People on the fringes of the electricity network can be more exposed to power outages, and the supply can be less reliable,” Dr Gilmore says. “If there’s a power outage during a bushfire, for example, a pumped hydro system will give you enough energy to last a day, whereas a battery typically lasts around eight hours.”

Although encouraging, the researchers say some limitations of the study require further analysis, including fluctuations in water availability, pump scheduling and discharge efficiency.

“Our findings are encouraging for further development of this emerging technology, and there is plenty of scope for future technological improvements that will make these systems increasingly cheaper over time,” Dr Gilmore says. 

“The next step would be setting up a pilot site, testing the performance of a system in action and modelling it in detail to get real-world validation – we have 30,000 potential candidates!”

Source: UNSW




Security of Smart Grids with Interacting Digital Systems



New methods to analyze cyber security risk in cyber-physical electric power systems.

The increased electrification of society and the need to manage new resources (such as renewable energy sources and flexible resources) and new loads (such as electric vehicles) are changing the electric power system.

A digital system, printed-circuit board – illustrative photo. Image credit: Bermix Studio via Unsplash, free license

The extent of sensors, communication, and automation is increasing, and monitoring and control of the electric power grid is becoming more active and digitalised. The result is a cyber-physical electric power system where the operation of the physical power system increasingly depends on data transmitted through digital networks.

This development increases the number of potential entry points for an attacker and makes the systems more difficult to protect. Also, society is more dependent on electric power than ever before, and the consequences of a successful cyber-attack on interacting digital systems may become catastrophic.

Therefore, we need appropriate methods to assess and reduce cyber security risks in cyber-physical electric power systems. In the InterSecure project, SINTEF Energi, SINTEF Digital, NTNU and Proactima have developed such methods in collaboration with Norwegian grid companies and authorities.

What is a cyber-physical electric power system?

We understand a cyber-physical system as a system of physical components controlled via digital networks.

Commonly, cyber-physical electric power grids are called smart grids. This name emphasises the enhanced possibilities for intelligence, i.e., control, monitoring, and automation, brought to electric grids when they are increasingly connected to digital networks.

What worries the grid operators today?

The emerging smart grid, with its increasing interconnection and exchange of data, increases the number of actors and stakeholders in the operation of power systems. This can potentially introduce several new or changed threats and vulnerabilities.

Discussions in the project have revealed some key sources of threats and vulnerabilities that the grid operators worry about today, and that are expected to become even more relevant in the future:

  • Extended digital networks that increase the number of possible entry points for cyber attackers,
  • new technology, components and systems that are rapidly introduced,
  • new connections between administrative IT systems and control systems that increase data flow across systems,
  • increased system complexity,
  • more interfaces between interdependent applications or systems, and
  • dependence on digital services from external suppliers.

The grid companies must be able to understand and handle new risks due to these system developments.

What kind of methods do the grid operators need to address their concerns?

The grid operators in the project secure their systems and manage risks according to current regulations. The main relevant regulations are Energiloven, Kraftberedskapsforskriften and Sikkerhetsloven.

Furthermore, the grid operators collect and use updated threat information from organisations providing notification services, such as KraftCERT, PST (Norwegian Police Security Service) and NSM (Norwegian National Security Authority).

Although the power supply is reliable today, and current regulations and risk management practices are well established, the grid operators are not well equipped to handle the new sources of threats and vulnerabilities described in the previous section.

Traditional power system risk management is not designed to capture the intentional nature of cyber security incidents, the widespread entry points created by far-reaching digital networks, or the vulnerabilities that cyber attackers can exploit through those entry points.

Also, cyber security risk analysis and traditional risk analysis are carried out separately. This is not optimal, as it does not allow the assessment of potential vulnerabilities arising from system interconnections, interdependencies and complexity.

In the following, risk assessment methods developed in the InterSecure project are briefly described.

Framework for risk assessment of cyber-physical electric power systems

The framework is based on the ISO 31000 and NS 5814 standards. It emphasises not only the physical system but the entire system of systems that is included in the operation of smart grids.

In fact, as smart grids develop and the system becomes more complex, it will be fruitless trying to understand the entire system and how all the elements relate and interact. The sheer size and complexity of the system will make this impossible.

Therefore, risk management needs to start from a high-level perspective of the system before focusing in on its different sections or areas.

As part of the InterSecure project, a risk management framework has been proposed that enables a more iterative approach to manage the risk of complex socio-technical systems, such as smart grids.

The framework follows the “plan, do, check, act” structure that is common in risk management frameworks. It consists of three main phases (plan, assess and manage) as well as three continuous phases: communication and consultation, recording and reporting, and monitoring and review.

Figure 1: Proposed risk management framework for interacting digital systems in smart grids

The overall structure of the risk management framework is that of an iterative process. Rather than trying to understand and model the entire system at once, it acknowledges the complexity within the system and takes an incremental, top-down approach.

This allows the system to be addressed first from a high-level perspective; the analysts can then become familiar with the different areas and risks of the system and find the right level at which to manage each risk.

Threat modelling

Threat modelling for interacting digital systems is the exercise of analysing how a piece of software or a system can be attacked, with the aim of protecting against such attacks. While several methods exist, one of the best known is STRIDE (Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege).

STRIDE starts by creating a model of the system to visualise how and what type of data is being transmitted between the different parts of the system. As an example, a part of the model used in InterSecure is shown in Figure 2. Based on this model, threats (i.e., potential attacks) are identified for the different parts of the system.
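To make the mechanics concrete, the following minimal Python sketch enumerates the STRIDE categories over a toy data-flow model. The components and data flows are hypothetical placeholders, not the model used in InterSecure.

```python
# Illustrative STRIDE enumeration over a toy data-flow model.
# Component and flow names are hypothetical placeholders, not the
# model used in the InterSecure project.

STRIDE = {
    "Spoofing": "pretending to be another component or user",
    "Tampering": "modifying data in transit or at rest",
    "Repudiation": "denying that an action took place",
    "Information disclosure": "exposing data to unauthorised parties",
    "Denial of service": "making a component or link unavailable",
    "Elevation of privilege": "gaining rights beyond those granted",
}

# Data flows in the toy model: (source, destination, protocol).
data_flows = [
    ("RTU", "substation gateway", "IEC 104"),
    ("substation gateway", "control centre", "IEC 104 over WAN"),
]

# For each data flow, list a candidate threat from every STRIDE
# category; in a real exercise each candidate is then evaluated.
for src, dst, proto in data_flows:
    print(f"Data flow: {src} -> {dst} ({proto})")
    for category, description in STRIDE.items():
        print(f"  [{category}] risk of {description}")
```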

To aid the STRIDE threat modelling process, Microsoft has developed the Microsoft Threat Modeling tool. This tool provides a graphical user interface to build the model of the system and a structured way of identifying and evaluating threats.

The tool was originally aimed at threat modelling of software, but since it allows users to create their own templates, we have adapted it to identify threats against the smart grid. The template developed in the project is available here.

Figure 2: Model used in STRIDE threat modelling

In this project, we performed threat modelling of a digital secondary substation to test and demonstrate the use of the tool in a smart grid context.

Guided by the threat categories making up the STRIDE mnemonic, threats towards the substation from each of the categories were identified. Information disclosure and denial of service threats were judged the most critical: they were evaluated to have potentially serious consequences while requiring neither specific knowledge nor specialised tools to execute.
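One way to think about this prioritisation is as a ranking of threats by consequence weighted by how easy the attack is to perform. The sketch below illustrates the idea with entirely hypothetical threats and scores; it is not the scoring used in the project.

```python
# Hypothetical ranking of identified threats by consequence and ease
# of attack. All names and scores are illustrative assumptions, not
# results from the InterSecure threat modelling.

threats = [
    # (threat, consequence 1-5, ease of attack 1-5)
    ("Information disclosure via traffic sniffing", 4, 4),
    ("Denial of service against the gateway", 4, 4),
    ("Spoofed control commands", 5, 2),
    ("Tampering with measurement data", 4, 2),
]

# Rank by consequence weighted by ease of attack.
for name, consequence, ease in sorted(
    threats, key=lambda t: t[1] * t[2], reverse=True
):
    print(f"score {consequence * ease:2d}: {name}")
```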

Communication impact simulations

Figure 3: Impact simulation model in the Mininet network emulator

We have developed two simulation models to verify the most critical threats (sniffing and availability attacks) identified by threat modelling. Both models have a topology comprising two digital secondary substations and a control centre.

The first model was created within the Mininet network emulator and selected as the primary model because it is easy to use and to move between machines: the entire model runs in a single virtual machine. The schema of the first model is shown in Figure 3.
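For readers unfamiliar with Mininet, the sketch below shows how a topology like this might be expressed in Mininet's Python API: two hosts standing in for the digital secondary substations and one for the control centre, joined by a switch. It is a minimal illustration under our own naming assumptions, not the project's actual model.

```python
# Minimal Mininet sketch of the topology described above: two hosts
# standing in for digital secondary substations and one for the
# control centre. Illustrative naming assumptions only, not the
# InterSecure model. Requires a machine with Mininet installed and
# must be run as root.
from mininet.net import Mininet
from mininet.topo import Topo
from mininet.cli import CLI
from mininet.log import setLogLevel


class SubstationTopo(Topo):
    def build(self):
        switch = self.addSwitch("s1")
        # "cc" is the control centre; "sub1" and "sub2" are the
        # two digital secondary substations.
        for name in ("cc", "sub1", "sub2"):
            self.addLink(self.addHost(name), switch)


if __name__ == "__main__":
    setLogLevel("info")
    net = Mininet(topo=SubstationTopo())
    net.start()
    CLI(net)  # interactive prompt, e.g. "pingall" or per-host commands
    net.stop()
```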

The second model was created using separate virtual machines for each component (RTUs, gateways, routers and the monitoring device).

This model was used only for performance testing during denial-of-service attacks, as its results corresponded more closely to reality than those of the Mininet model.

A performance evaluation of the model is described in the article “Threat Modeling of a Smart Grid Secondary Substation”. The model was not considered further due to its complexity and the lack of an easy way to export it. The model schema is shown in Figure 4.

Figure 4: Impact simulation model using virtual machines

Both impact simulation models used emulated IEC 104 communication corresponding to data from the National Smart Grid Lab in Trondheim.
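For context, IEC 104 (IEC 60870-5-104) runs over TCP, conventionally on port 2404, and a session begins with a STARTDT handshake. The sketch below is a deliberately simplified illustration, not the emulation used in the project: an endpoint that answers a STARTDT act U-frame with a STARTDT con.

```python
# Sketch of an IEC 104 endpoint that answers the STARTDT handshake.
# Deliberately simplified; a real RTU emulation implements far more
# of IEC 60870-5-104 (I- and S-frames, ASDUs, timers, and so on).
import socket

STARTDT_ACT = bytes([0x68, 0x04, 0x07, 0x00, 0x00, 0x00])  # U-frame
STARTDT_CON = bytes([0x68, 0x04, 0x0B, 0x00, 0x00, 0x00])  # U-frame

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("0.0.0.0", 2404))  # 2404 is the conventional IEC 104 port
server.listen(1)

conn, addr = server.accept()
if conn.recv(6) == STARTDT_ACT:
    conn.sendall(STARTDT_CON)  # confirm that data transfer may start
conn.close()
server.close()
```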

The results from testing the simulation models can be used by grid operators to improve grid security, for example by tuning security devices such as firewalls. The first model was provided to all members of the InterSecure project and was also demonstrated.

In this demonstration, all participants could install the model on their own devices and learn its basic operation in a provided scenario. A demonstration is also available on YouTube.

Assessment of vulnerabilities and failure consequences

Smart grids are complicated systems, so no single model or framework can uncover all vulnerabilities. Hence, a selection of models and frameworks is needed to help grid operators view the problem at hand from different angles.

To complement the other methods in the InterSecure project, an approach for assessing vulnerabilities and failure consequences for cyber-physical power grids based on the bow-tie model has been developed.

The approach is illustrated in Figure 5. The first part of the analysis is a bow-tie analysis for a selected scenario involving a specific critical asset, i.e., an asset that can directly impact the distribution of electricity.

Next, assumptions about the operating state of the power system are made, and the coping capacity and consequences at the system level are assessed.

Figure 5: A bow-tie model with the critical asset event at the centre. The left side illustrates the four zones in the Purdue model. On the right side, the zone closest to the centre is related to the event tree from the asset perspective, while the rest of the right side is related to the power system consequence assessment. The vertical orange bars represent barriers. Adapted from Sperstad et al., 2020.
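To illustrate the structure of such an analysis in code, the sketch below encodes a bow-tie as plain data: threats and their barriers on the left of the critical asset event, consequences and their barriers on the right. The scenario content is hypothetical and is not taken from the case study described below.

```python
# Illustrative encoding of a bow-tie analysis as plain data. The
# scenario, threats, barriers and consequences are hypothetical and
# not taken from the InterSecure case study.

bow_tie = {
    "critical_asset_event": "Loss of control of a secondary substation",
    # Left side: threats and the barriers intended to stop them.
    "threats": [
        {"threat": "Malware introduced via remote access",
         "barriers": ["VPN with multi-factor authentication",
                      "Network segmentation"]},
        {"threat": "Technical failure of the substation gateway",
         "barriers": ["Redundant gateway", "Condition monitoring"]},
    ],
    # Right side: consequences and the barriers that limit them.
    "consequences": [
        {"consequence": "Interruption of supply to connected customers",
         "barriers": ["Manual local operation",
                      "Reconfiguration of the distribution network"]},
    ],
}

for t in bow_tie["threats"]:
    print(f"threat: {t['threat']}  barriers: {', '.join(t['barriers'])}")
print(f"event:  {bow_tie['critical_asset_event']}")
for c in bow_tie["consequences"]:
    print(f"consequence: {c['consequence']}  "
          f"barriers: {', '.join(c['barriers'])}")
```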

The proposed approach has been tested on a case related to conditional connection agreements at a Norwegian DSO.

One advantage of the proposed approach is that the bow-tie model is well known in the industry; thus, little time was needed to explain the method to the participants. The bow-tie model was also found to be flexible enough to incorporate both traditional threats, such as technical failures, and cyber threats from malicious actors in the same diagram.

Further, the approach aided in building a common understanding among participants from the different departments of the grid operator by visualising threats, vulnerabilities, barriers, and consequences in the same diagram.

The bow-tie analyses are, however, time-consuming to perform. Considerable time is also needed to process the results before they can be used further in the risk management process.

Another consequence of the flexibility of the bow-tie method is that successful use depends on the facilitator's ability to guide the group discussion so that the relevant threats and vulnerabilities are covered.

Because of this, there is a need for a structured overall approach to ensure that this type of analysis is used on the relevant assets and threats.

To summarize, the methods tested in InterSecure are applicable in different situations requiring different levels of detail. The suggested framework can be used at a high level.

Threat modelling can be used to identify information flows and threats, and to sort out the most important threats for more detailed analysis and follow-up.

The simulation models are useful for detailed testing of concrete attacks with a realistic communication and network topology, while the assessment of vulnerabilities and failure consequences is useful for in-depth analysis of both physical and cyber threats, vulnerabilities and barriers. The DSO should test the methods and plan which method to use when.

Source: SINTEF




Webb Reveals New Structures Within Iconic Supernova



NASA’s James Webb Space Telescope has begun the study of one of the most renowned supernovae, SN 1987A (Supernova 1987A).

Located 168,000 light-years away in the Large Magellanic Cloud, SN 1987A has been a target of intense observations at wavelengths ranging from gamma rays to radio for nearly 40 years, since its discovery in February of 1987.

Recent observations by Webb’s NIRCam (Near-Infrared Camera) provide a crucial clue to our understanding of how a supernova develops to shape its remnant.

Webb’s NIRCam (Near-Infrared Camera) captured this detailed image of SN 1987A (Supernova 1987A). At the center, material ejected from the supernova forms a keyhole shape. Just to its left and right are faint crescents newly discovered by Webb. Beyond them an equatorial ring, formed from material ejected tens of thousands of years before the supernova explosion, contains bright hot spots. Exterior to that is diffuse emission and two faint outer rings. In this image blue represents light at 1.5 microns (F150W), cyan 1.64 and 2.0 microns (F164N, F200W), yellow 3.23 microns (F323N), orange 4.05 microns (F405N), and red 4.44 microns (F444W). Image Credit: NASA, ESA, CSA, M. Matsuura (Cardiff University), R. Arendt (NASA’s Goddard Space Flight Center & University of Maryland, Baltimore County), C. Fransson (Stockholm University), and J. Larsson (KTH Royal Institute of Technology)

This image reveals a central structure like a keyhole. This center is packed with clumpy gas and dust ejected by the supernova explosion. The dust is so dense that even the near-infrared light that Webb detects cannot penetrate it, creating the dark “hole” in the keyhole.

A bright, equatorial ring surrounds the inner keyhole, forming a band around the waist that connects two faint arms of hourglass-shaped outer rings. The equatorial ring, formed from material ejected tens of thousands of years before the supernova explosion, contains bright hot spots, which appeared as the supernova’s shock wave hit the ring.

Spots are now found even exterior to the ring, with diffuse emission surrounding it. These are the locations where supernova shocks are hitting more exterior material.

While these structures have been observed to varying degrees by NASA’s Hubble and Spitzer space telescopes and the Chandra X-ray Observatory, Webb’s unparalleled sensitivity and spatial resolution revealed a new feature in this supernova remnant: small crescent-like structures.

These crescents are thought to be a part of the outer layers of gas shot out from the supernova explosion. Their brightness may be an indication of limb brightening, an optical phenomenon that results from viewing the expanding material in three dimensions.

In other words, our viewing angle makes it appear that there is more material in these two crescents than there actually may be.

The high resolution of these images is also noteworthy. Before Webb, the now-retired Spitzer telescope observed this supernova in infrared throughout its entire lifespan, yielding key data about how its emissions evolved over time. However, it was never able to observe the supernova with such clarity and detail.

Webb’s NIRCam (Near-Infrared Camera) captured this detailed image of SN 1987A (Supernova 1987A), which has been annotated to highlight key structures. At the center, material ejected from the supernova forms a keyhole shape. Just to its left and right are faint crescents newly discovered by Webb. Beyond them an equatorial ring, formed from material ejected tens of thousands of years before the supernova explosion, contains bright hot spots. Exterior to that is diffuse emission and two faint outer rings. In this image blue represents light at 1.5 microns (F150W), cyan 1.64 and 2.0 microns (F164N, F200W), yellow 3.23 microns (F323N), orange 4.05 microns (F405N), and red 4.44 microns (F444W). Image credits: NASA, ESA, CSA, M. Matsuura (Cardiff University), R. Arendt (NASA’s Goddard Space Flight Center & University of Maryland, Baltimore County), C. Fransson (Stockholm University), and J. Larsson (KTH Royal Institute of Technology). Image credit: A. Pagan

Despite the decades of study since the supernova’s initial discovery, several mysteries remain, particularly surrounding the neutron star that should have formed in the aftermath of the explosion.

Like Spitzer, Webb will continue to observe the supernova over time. Its NIRSpec (Near-Infrared Spectrograph) and MIRI (Mid-Infrared Instrument) instruments will offer astronomers the ability to capture new, high-fidelity infrared data over time and gain new insights into the newly identified crescent structures.

Further, Webb will continue to collaborate with Hubble, Chandra, and other observatories to provide new insights into the past and future of this legendary supernova.

The James Webb Space Telescope is the world’s premier space science observatory. Webb is solving mysteries in our solar system, looking beyond to distant worlds around other stars, and probing the mysterious structures and origins of our universe and our place in it. Webb is an international program led by NASA with its partners, ESA (European Space Agency) and the Canadian Space Agency.

Source: NASA


