
Immunotherapy substantially increases survival of people with lymphomatoid granulomatosis


Results from a clinical trial conducted by researchers at the National Institutes of Health (NIH) show that people with low-grade lymphomatoid granulomatosis treated with interferon alfa-2b, a type of immunotherapy, can live for decades after diagnosis.

Doctor performing surgery – illustrative photo. Image credit: Jafar Ahmed via Unsplash, free license

Lymphomatoid granulomatosis is a rare precancerous condition triggered by Epstein-Barr virus infection. Left untreated, the disease can progress to a high-grade form, which has a poorer prognosis and can quickly turn into an aggressive and fatal B-cell lymphoma.

Treatment with interferon alfa-2b led to the disappearance of a large lesion (top image) in the lungs of a patient with lymphomatoid granulomatosis, as shown in this computed tomography chest scan. Image credit: Center for Cancer Research

In the phase 2 trial, led by researchers in the Center for Cancer Research at the National Cancer Institute (NCI), part of NIH, patients treated with interferon alfa-2b lived for a median of about 20 years.

By contrast, past studies reported a median survival of less than two years for people with lymphomatoid granulomatosis.

The findings suggest that immunotherapy can prevent the progression of low-grade disease to high-grade disease. The results were published in Lancet Haematology.

“We have shown in this rare disorder that using a novel immunotherapy-based approach for low-grade disease is effective and improves survival compared with historical treatments such as chemotherapy and corticosteroids,” said Christopher J. Melani, M.D., of NCI’s Center for Cancer Research, who co-led the study.

“I think the results of this study represent a significant contribution to determining the standard-of-care treatment for this rare disease.”

Lymphomatoid granulomatosis causes an overproduction of white blood cells known as B lymphocytes. Patients typically have lesions in the lungs, central nervous system, skin, liver, and kidneys.

Symptoms can include cough, shortness of breath, fever, weight loss, and fatigue. Chemotherapy is currently the standard treatment for people with high-grade disease, but there is no standard treatment for low-grade disease.

“Although lymphomatoid granulomatosis is uncommon, the effects of high-grade disease can be debilitating,” said Jeffrey Cohen, M.D., chief of the Laboratory of Infectious Diseases at the National Institute of Allergy and Infectious Diseases and a co-leader of the study.

“We need better ways to prevent the disease from progressing to this more severe state, such as interferon alfa-2b.”

NIH researchers have been studying lymphomatoid granulomatosis since the 1980s. In the early 1990s, Wyndham Wilson, M.D., Ph.D., also of NCI’s Center for Cancer Research, hypothesized that low-grade disease results from a defective immune response to the Epstein-Barr virus and could therefore be treated with immunotherapy, whereas high-grade disease requires chemotherapy to curb uncontrolled cell growth.

He and his colleagues treated four people with low-grade lymphomatoid granulomatosis with interferon alfa-2b over a 5-year period, and the treatment eradicated all signs of the disease – known as a complete remission – in three of those patients.

That study laid the foundation for the phase 2 trial of interferon alfa-2b in lymphomatoid granulomatosis, which has taken 30 years to complete because of the rarity of the disease and the challenges of recruiting enough patients for the study.

“This really illustrates the unique ability of NIH to do a study like this that nobody else could do and no one else ever has done for this particular disease,” said Dr. Wilson, who co-led the study.

The trial included 67 people with lymphomatoid granulomatosis, 37 with low-grade disease and 30 with high-grade disease. The participants either had not yet been treated for the disease or had disease that had not responded to, or had returned after, other treatments.

During their initial treatment, most of the patients with low-grade disease received subcutaneous injections of interferon alfa-2b three times a week in increasing doses for about one year. Most of the patients with high-grade disease were given six cycles of intravenous chemotherapy every three weeks.

Both groups improved, with the disease disappearing in 27 of 44 patients (61%) treated with interferon alfa-2b, and 8 of 17 patients (47%) treated with chemotherapy.

After their initial treatment, some patients subsequently received the other therapy, called crossover treatment.

Patients with low-grade disease that worsened after immunotherapy were given chemotherapy, whereas patients with high-grade disease that came back after chemotherapy were given interferon alfa-2b. Previous work showed that after high-grade disease is eliminated with chemotherapy, low-grade disease can re-emerge.

The crossover treatments were also effective, with the disease disappearing in 4 of 8 patients (50%) treated with interferon alfa-2b after chemotherapy and 7 of 15 patients (47%) treated with chemotherapy after interferon alfa-2b.  

Median overall survival was 20.6 years for patients treated initially with interferon alfa-2b and 19.8 years for patients who crossed over to receive interferon alfa-2b. Median overall survival was 12.1 years for patients treated initially with chemotherapy and not reached for those who crossed over to receive chemotherapy. 

The most common side effect of interferon alfa-2b treatment was low white blood cell count, and the most common side effects with chemotherapy were low white blood cell count and infection. Serious side effects occurred in only a quarter of the patients treated with interferon alfa-2b, compared with nearly two-thirds of patients treated with chemotherapy.  

Many newer immunotherapies, such as nivolumab, could potentially be used to treat low-grade lymphomatoid granulomatosis and other Epstein-Barr virus–associated disorders and may have fewer side effects.

“The trial of interferon alfa-2b established that immunotherapy improves survival in patients with low-grade lymphomatoid granulomatosis,” Dr. Melani said. “Now we can look into more novel immunotherapies that are easier to tolerate to see if they can improve on the efficacy of our current treatment.” 

Source: NIH



Study sheds light on the dark side of AI


To understand how to get artificial intelligence right, we need to know how it can go wrong, says researcher.

Artificial intelligence – artistic concept. Image credit: Icons8 Team via Unsplash, free license

Artificial intelligence is touted as a panacea for almost every computational problem these days, from medical diagnostics to driverless cars to fraud prevention.

But when AI fails, it does so “quite spectacularly,” says Vern Glaser of the Alberta School of Business. In his recent study, “When Algorithms Rule, Values Can Wither,” Glaser explains how AI’s efficiency imperative often subsumes human values, and why the costs can be high.

“If you don’t actively try to think through the value implications, it’s going to end up creating bad outcomes,” he says.

When bots go bad

Glaser cites Microsoft’s Tay as one example of bad outcomes. When the chatbot was introduced on Twitter in 2016, it was taken offline within 24 hours after trolls taught it to spew racist language.

Then there was the “robodebt” scandal of 2015, when the Australian government used AI to identify overpayments of unemployment and disability benefits. But the algorithm presumed every discrepancy reflected an overpayment and automatically sent notification letters demanding repayment. If a recipient did not respond, the case was forwarded to a debt collector.

By 2019, the program had identified over 734,000 overpayments worth two billion Australian dollars (C$1.8 billion).
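To make the failure mode concrete, here is a minimal sketch of the widely reported income-averaging logic behind robodebt. The function, figures, and seasonal-worker example are hypothetical illustrations, not the government’s actual code.

```python
# Minimal sketch of the widely reported "robodebt" logic: annual income
# from tax records was averaged across fortnights, and any mismatch with
# fortnightly welfare reports was presumed to be an overpayment.
# All names and figures below are hypothetical illustrations.

FORTNIGHTS_PER_YEAR = 26

def flag_debt(annual_income: float, reported_fortnightly: list[float]) -> float:
    """Return the 'overpayment' the automated system would claim."""
    averaged = annual_income / FORTNIGHTS_PER_YEAR
    # The flaw: someone who earned all their income in a few months while
    # legitimately receiving benefits the rest of the year still shows a
    # "discrepancy" in every fortnight they reported low income.
    return sum(max(averaged - reported, 0.0) for reported in reported_fortnightly)

# A seasonal worker: roughly A$26,000 earned in 6 fortnights, benefits after.
reports = [4_333.0] * 6 + [0.0] * 20
print(flag_debt(26_000.0, reports))  # claims a large, spurious "debt"
```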

“The idea was that by eliminating human judgment, which is shaped by biases and personal values, the automated program would make better, fairer and more rational decisions at much lower cost,” says Glaser.

But the human consequences were dire, he says. Parliamentary reviews found “a fundamental lack of procedural fairness” and called the program “incredibly disempowering to those people who had been affected, causing significant emotional trauma, stress and shame,” including at least two suicides.

While AI promises to bring enormous benefits to society, we are now also beginning to see its dark underbelly, says Glaser. In a recent Globe and Mail column, Lawrence Martin points out AI’s dystopian possibilities, including autonomous weapons that can fire without human supervision, cyberattacks, deepfakes and disinformation campaigns. Former Google CEO Eric Schmidt has warned that AI could quite easily be used to construct killer biological weapons.

Glaser roots his analysis in French philosopher Jacques Ellul’s notion of “technique,” offered in his 1954 book The Technological Society, by which the imperatives of efficiency and productivity determine every field of human activity.

“Ellul was very prescient,” says Glaser. “His argument is that when you’re going through this process of technique, you are inherently stripping away values and creating this mechanistic world where your values essentially get reduced to efficiency. 

“It doesn’t matter whether it’s AI or not. AI in many ways is perhaps only the ultimate example of it.”

A principled approach to AI

Glaser suggests adherence to three principles to guard against the “tyranny of technique” in AI. First, recognize that because algorithms are mathematical, they rely on “proxies,” or digital representations of real phenomena.

One way Facebook gauges friendship, for example, is by how many friends a user has, or by the number of likes received on posts from friends.

“Is that really a measure of friendship? It’s a measure of something, but whether it’s actually friendship is another matter,” says Glaser, adding that the intensity, nature, nuance and complexity of human relationships can easily be overlooked.

“When you’re digitizing phenomena, you’re essentially representing something as a number. And when you get this kind of operationalization, it’s easy to forget it’s a stripped-down version of whatever the broader concept is.”
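As a toy illustration of how a proxy strips away nuance, consider a hypothetical “friendship score” of the kind Glaser describes, built only from what happens to be countable:

```python
# Hypothetical "friendship" proxy: a single number built from counts,
# which ignores the intensity, nature, and nuance of relationships.

def friendship_score(num_friends: int, likes_from_friends: int) -> float:
    # A proxy is just a weighted sum of whatever happens to be measurable.
    return 0.5 * num_friends + 1.0 * likes_from_friends

# Two very different social lives can collapse to the same score:
print(friendship_score(num_friends=500, likes_from_friends=0))   # 250.0
print(friendship_score(num_friends=2, likes_from_friends=249))   # 250.0
```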

For AI designers, Glaser recommends strategically inserting human interventions into algorithmic decision-making, and creating evaluative systems that account for multiple values.

“There’s a tendency when people implement algorithmic decision-making to do it once and then let it go,” he says, but AI that embodies human values requires vigilant and continuous oversight to prevent its ugly potential from emerging.
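One minimal sketch of what such a human intervention might look like in code is a routing rule that escalates low-confidence or high-impact decisions to a person rather than acting automatically; the thresholds here are purely hypothetical.

```python
# Sketch of inserting a human intervention into algorithmic decision-making:
# route low-confidence or high-impact cases to a reviewer instead of
# acting automatically. Thresholds and labels are hypothetical.

def decide(score: float, confidence: float, impact: str) -> str:
    if confidence < 0.8 or impact == "high":
        return "escalate to human reviewer"   # the deliberate pause
    return "approve" if score >= 0.5 else "reject"

print(decide(score=0.9, confidence=0.95, impact="low"))   # approve
print(decide(score=0.9, confidence=0.60, impact="low"))   # escalate
print(decide(score=0.2, confidence=0.99, impact="high"))  # escalate
```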

In other words, AI simply reflects who we are — at our best and worst. The latter could take over without a good, hard look in the mirror.

“We want to make sure we understand what’s going on, so the AI doesn’t manage us,” he says. “It’s important to keep the dark side in mind. If we can do that, it can be a force for social good.”

Source: University of Alberta



Twinkling stars fuel interstellar dust


Of the many kinds of stars, asymptotic giant branch (AGB) stars, usually slightly larger and older than our own sun, are known to produce interstellar dust. Dusty AGBs are particularly prominent dust producers, and the light they shine varies widely.

For the first time, a long-period survey has found the variable intensity of dusty AGBs coincides with variations in the amount of dust these stars produce. As this dust can lead to the creation of planets, its study can shed light on our own origins.

AGB. Artist’s impression of an asymptotic giant branch star. Image credit: Miyata, Tachibana, et al. CC-BY

You’ve probably heard of the James Webb Space Telescope (JWST), which has been in the news lately. It’s famous for being the largest and most sensitive space telescope designed to observe infrared (IR) light.

But long before the JWST took to the skies, two other IR space telescopes, AKARI and WISE, were surveying the cosmos. Both have since ended their initial missions, but they produced so much valuable data that astronomers are still making new discoveries with it.

The latest finding from that data, made by doctoral student Kengo Tachibana of the University of Tokyo’s Institute of Astronomy and his team, could have implications for the study of the origins of life itself.

“We study stars, and IR light from them is a key source of information that helps us unlock their secrets,” said Tachibana.

“Until recently, most IR data was from very short-period surveys due to the lack of advanced dedicated platforms. But missions like AKARI and WISE have allowed us to take longer-period surveys of things. This means we can see how things might change over greater time periods, and what these changes might imply. Lately, we turned our attention to a certain class of star known as asymptotic giant branch stars, which are interesting because they are the main producers of interstellar dust.”

This interstellar dust is not the same stuff that accumulates on your floor when you forget to vacuum for a few days; it’s a name given to heavy elements that disperse from stars and lead to the formation of solid objects including planets.

Although it’s long been known that AGBs, especially dusty AGBs, are the main producers of dust, it’s unknown what the main drivers of dust production are and where we should be looking to find this out.

“Our latest study has pointed us in the right direction,” said Tachibana.

“Thanks to long-period IR observations, we have found that the light from dusty AGBs varies with periods longer than several hundred days. We also found that the spherical shells of dust produced by and then ejected by these stars have concentrations of dust that vary in step with the stars’ changes in luminosity. Of the 169 dusty AGBs surveyed, no matter their variability period, the concentrations of dust around them would coincide. So, we’re certain these are connected.”
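As a toy illustration of what “varying in step” means in practice, one can correlate a star’s light curve with a dust-concentration index measured over the same epochs. This is not the team’s actual pipeline; the period, noise, and data below are synthetic.

```python
import numpy as np

# Toy illustration of "varying in step": correlate a synthetic AGB light
# curve with a dust-concentration index over the same epochs.

rng = np.random.default_rng(0)
t = np.linspace(0, 3000, 200)                 # observation times, days
period = 700.0                                # "longer than several hundred days"
luminosity = np.sin(2 * np.pi * t / period) + 0.1 * rng.normal(size=t.size)
dust = np.sin(2 * np.pi * t / period) + 0.2 * rng.normal(size=t.size)

r = np.corrcoef(luminosity, dust)[0, 1]
print(f"Pearson r = {r:.2f}")                 # close to 1: varying in step
```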

Interstellar dust. Artist’s impression of how asymptotic giant branch stars exert pressure on solid matter. Image credit: Miyata, Tachibana, et al. CC-BY

Finding a connection between the concentration of dust and the variability of stars’ brightness is just the first step in this investigation, however.

Now the team wishes to explore the possible physical mechanisms behind dust production. For this, they intend to continuously monitor various AGB stars for many years. The University of Tokyo is nearing completion of a large ground-based telescope project, the University of Tokyo Atacama Observatory, in Chile, dedicated to making infrared observations.

Source: University of Tokyo



New breakthrough: 3D printed ferroelectric materials eliminate harmful bacteria, including E. coli


New fabrication technique gives materials antimicrobial properties, with scope to improve safety of implants including heart valves and stents.

A new way of using 3D printing to create infection-fighting materials for use as medical implants has been revealed in a new research paper.

An impression of bacteria being destroyed by the new ferroelectric composite material. Image credit: University of Bath

Engineers at the University of Bath, working with colleagues at the University of Ulster, have for the first time successfully created a new kind of ferroelectric composite material with antimicrobial properties using a novel multi-material 3D printing process.

They say the use of electrically responsive ferroelectric materials gives the implants infection-fighting properties, making them ideal for biomedical applications such as heart valves, stents, and bone implants, reducing the risk of infection for patients.

While commonplace, all biomedical implants pose some level of risk as materials can carry surface bio-contaminants that can lead to infection. Reducing this risk could be beneficial both to patients in the form of improved outcomes, and to healthcare providers thanks to reduced costs incurred by ongoing treatment.

The team has previously used this 3D printing technique for the fabrication of three-dimensional scaffolds for bone tissue engineering.

Dr Hamideh Khanbareh, a lecturer in materials and structures in Bath’s Department of Mechanical Engineering, is lead author of the research. She says that the development has the scope for wide-ranging applications.

She said: “Biomedical implants that can fight infection or dangerous bacteria such as E. coli could present significant benefits to patients and to healthcare providers.

“Our research indicates that the ferroelectric composite materials we have created have a great potential as antimicrobial materials and surfaces. This is a potentially game-changing development that we would be keen to develop further through collaboration with medical researchers or healthcare providers.”

The innovation comes thanks to ferroelectricity, a characteristic of certain polar materials that generate electrical surface charge in response to a change in mechanical energy or temperature. In ferroelectric films and implants, this electrical charge leads to the formation of free radicals known as reactive oxygen species (ROS), which selectively eradicate bacteria.

This comes about through the micro-electrolysis of water molecules on a surface of polarised ferroelectric composite material.

The composite material used to harness this phenomenon is made by embedding ferroelectric barium calcium zirconate titanate (BCZT) micro-particles in polycaprolactone (PCL), a biodegradable polymer widely used in biomedical applications. The mixture of ferroelectric particles and polymer is then fed into a 3D bioprinter to create a specific porous ‘scaffold’ shape designed with a high surface area to promote ROS formation.

Testing showed that even when contaminated with high concentrations of aggressive E. coli bacteria, the composite can eradicate the bacterial cells without external intervention, killing 70% within just 15 minutes.
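For a rough sense of scale: if the killing followed simple first-order decay – an assumption the paper does not state – the reported 70% in 15 minutes would fix the rate constant and allow an extrapolation to near-total eradication.

```python
import math

# If bacterial killing followed simple first-order decay (an assumption,
# not something the Bath paper states), "70% killed in 15 minutes" fixes
# the rate constant, letting us extrapolate to near-total eradication.

surviving_fraction = 0.30        # 70% killed
t_observed = 15.0                # minutes
k = -math.log(surviving_fraction) / t_observed   # ~0.080 per minute

for target in (0.01, 0.001):     # 99% and 99.9% killed
    t = -math.log(target) / k
    print(f"{(1 - target):.1%} killed after ~{t:.0f} minutes")
# -> roughly 57 and 86 minutes under this idealized model
```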

Source: University of Bath



Missing crew – one more reason why the only Russian aircraft carrier will not sail anytime soon


The official date is set: the Russian aircraft carrier “Admiral Kuznetsov” should be fully repaired by 2024. But even if the technical work stays on schedule, assembling and training a new crew for this ship will certainly not happen quickly.

Russian aircraft carrier Admiral Kuznetsov – photo from 12 December 2011. Image credit: UK MOD via Flickr, CC BY-NC 2.0

“Admiral Kuznetsov” was designed to have 1,900 technical personnel onboard. The new crew should consist of 1,500 people; the reduction is possible because many of the internal systems will be automated and will therefore require less maintenance.

Still, a crew of 1,500 is not a simple thing to prepare: most of these servicemen will need long and highly specialized training before the vessel is ready to perform even basic tasks. Also, due to the ongoing Russian invasion of Ukraine, finding the necessary crew members will not be a simple task.

The repair and overhaul of “Admiral Kuznetsov” have been underway for 6 years. After the ship was stationed at the repair facility, the former crew was disbanded.

Of course, some operations onboard an aircraft carrier are broadly similar to those on other types of military ships. Nevertheless, every aircraft carrier has its own specifics, including the fact that the crew must be fully capable of maintaining aircraft and aviation equipment.

“Even if there are 1500 suitable sailors, their training and familiarization [with the aircraft carrier] will take months, since the Kuznetsov is the largest surface ship in Russia. Even the old team would find it difficult to learn how to properly operate with newly installed equipment. An improperly trained team can lead to major accidents,” commented Russian Navy expert Matus Smutny.

Similarly, pilots of the Su-33 and MiG-29KR naval fighter jets will need to complete additional training after seven years without regular practice at sea.

Previously, Ukrainian military intelligence reported that the ship suffers from significant hull corrosion, which is extremely difficult to repair.



Cheap but efficient drones going mainstream in the Ukrainian Army


Ukrainian engineers are learning to make maximum use of simple and cheap drones. They not only construct their own flying machines, but also develop specialized munitions and adapt novel attack methods.

Drone and its projectile release mechanisms constructed by the “Steel Hornets”. Image credit: Steel Hornets

The tactics used by the Armed Forces of Ukraine are usually based on drones that drop munitions while flying over the positions of the enemy. Such unmanned flyers are often made by volunteer communities using non-military grade parts.

There are only two main technical requirements: the drone must be remotely controllable with visual feedback, and it must be capable of carrying an explosive charge.

One such community, named “Steel Hornets”, manufactures drones for Ukrainian soldiers, constructs ammunition release systems, and has even optimized munition designs for this purpose. It recently demonstrated the testing of its new development – shrapnel ammunition.

Shrapnel ammunition for drones made by the Ukrainian volunteer organization “Steel Hornets”

Each shrapnel charge weighs 800 grams (around 1.8 lb) and has a diameter of 63 mm. It uses a mechanical fuse.

Generally, shrapnel explosives are packed with metal balls several millimeters in diameter that are scattered in the vicinity of the detonation. Modern versions may contain sharp pieces of metal, sometimes shaped as small arrows to increase their flight distance.

But it is not only ammunition that plays an important role on the battlefield. Ukrainian drone pilots are using a ‘diving’ technique to increase the efficiency of their strikes.

In this method, the drone dives along the intended release trajectory, increasing the projectile’s initial velocity at the moment it leaves the drop mechanism. This also allows strikes from a lower altitude, which increases accuracy and simplifies aiming.
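A back-of-envelope kinematics sketch shows why this helps: an initial downward velocity shortens the fall time, leaving wind and drift less time to act on the projectile. The release altitude and dive speeds below are hypothetical.

```python
import math

# Back-of-envelope kinematics for the 'diving' release: an initial
# downward velocity shortens the fall time, so wind and drift have less
# time to act. Altitude and dive speeds are hypothetical.

g = 9.81  # m/s^2

def fall_time(height_m: float, v0_down: float) -> float:
    # Solve h = v0*t + g*t^2/2 for t (taking the positive root).
    return (-v0_down + math.sqrt(v0_down**2 + 2 * g * height_m)) / g

for v0 in (0.0, 10.0, 20.0):     # level flight vs. two dive speeds
    t = fall_time(100.0, v0)     # release from 100 m
    print(f"dive speed {v0:>4.0f} m/s -> falls for {t:.2f} s")
# -> about 4.5 s, 3.6 s and 2.9 s: a faster dive means less drift.
```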

Is it possible to recover the drone?

Yes. The drone does not hit the ground, but changes its trajectory right after the drop. As Defense Express notes, the drone essentially turns into a miniature combat aircraft able to repeat manoeuvres similar to those of large airplanes.

The process could be simplified even further by programming an automated return function to optimize the drone’s recovery after it has performed its mission.



Ukraine is using the ancient anti-aircraft gun KS-19. But does it work best against ground targets?


The Soviet anti-aircraft gun KS-19 entered production in 1947, developed to replace old WW2-era 85 mm anti-aircraft guns. It is now very old, but the defenders of Ukraine are still using these weapons. Where did they get them? And can they still be effective at all?

KS-19 is a 100 mm anti-aircraft gun, which can be pressed into service against ground targets. Image credit: Lvova Anastasiya via Wikimedia (CC BY-SA 3.0)

The Armed Forces of Ukraine use the KS-19, a 100 mm anti-aircraft gun that entered service with the Soviet Army in 1947 – 76 years ago. All the people involved in its development are long dead.

A video appeared on the Internet showing as many as four KS-19 guns in the hands of the defenders of Ukraine. Ukraine Weapons Tracker does not exclude the possibility that these anti-aircraft guns are being used against ground targets – the KS-19 is definitely suitable for that. In fact, the KS-19 has been used for direct and indirect fire against lightly armoured vehicles and personnel positions throughout its service life.

Defense Express notes that these KS-19s may have appeared in the Armed Forces of Ukraine as trophies seized during the counteroffensive in the Kharkiv region, where the Ukrainian army captured at least four KS-19 guns. However, it is also possible that these anti-aircraft guns were pulled out of Ukraine’s own weapons storage.

At the beginning of the Russian invasion, Ukraine stored multiple KS-19 guns in Balakliia, a city in Kharkiv Oblast. The city was captured by the advancing Russian forces, and the invaders did use these KS-19s, sometimes even as decoys to divert the attention of Ukrainian reconnaissance.

However, as Russian positions in the Kharkiv region weakened, Russia was swiftly kicked out of Balakliia and Ukraine retook its KS-19s and other weapons.

Journalists note that, despite its age, the KS-19 can be useful for the Armed Forces of Ukraine. The characteristics of this weapon allow it to engage ground targets at distances of up to 20 km, while air targets can be reached at altitudes of up to 15 km.

The KS-19’s rate of fire is up to 15 rounds per minute. The entire weapon weighs just under 10 tonnes and needs a crew of 15 people, but towing that weight is hardly an issue for modern trucks. Attacking large air targets with the KS-19 might be tricky nowadays, but smaller drones can feel the wrath of this ancient weapon.

The main problem with such old weapons is the availability of ammunition. It is possible that Bulgaria, which also operates the KS-19, helped with ammunition. Ukraine probably had some suitable 100 mm rounds stashed as well, since this system has been in use for decades.

 

Sources: Focus.ua, Wikipedia



What Is Robotic Process Automation in the Manufacturing Industry?


Robotic Process Automation (RPA) is a technology based on software robots used to automate rule-based, repetitive processes. RPA benefits companies in a variety of industries, and manufacturing is no exception. In this article, you can learn more about RPA in the manufacturing industry, including its benefits and potential challenges.

Manufacturing – illustrative photo.

How Can a Manufacturing Company Use RPA?

RPA can automate a wide range of processes in manufacturing businesses: tasks performed by any company regardless of its specialization, such as invoicing, as well as operations specific to a particular type of manufacturing.

When it comes to invoicing and accounting, RPA can be applied to a variety of processes, including invoice creation and verification, payment scheduling, collecting and processing information, and updating the system.

A great example of RPA applications for tasks specifically related to manufacturing is inventory management. Companies can automate the process of tracking inventory to determine stock levels and issue purchase orders to manage supply levels with minimal human supervision. This can help businesses to optimize inventory levels and avoid downtime caused by insufficient supply.
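A minimal sketch of this inventory-tracking flow might look like the following; the item names, reorder points, and issue_purchase_order stub are hypothetical, standing in for calls to a real ERP system.

```python
# Sketch of the inventory flow described above: check stock levels and
# issue purchase orders below a reorder point. Items, thresholds, and
# the issue_purchase_order stub are hypothetical.

REORDER_POINTS = {"steel sheet": 200, "bearings": 500, "paint (L)": 50}
stock = {"steel sheet": 140, "bearings": 820, "paint (L)": 12}

def issue_purchase_order(item: str, quantity: int) -> None:
    # A real bot would call the ERP system's API here.
    print(f"PO issued: {quantity} x {item}")

for item, reorder_point in REORDER_POINTS.items():
    if stock[item] < reorder_point:
        # Order enough to restore twice the reorder point, a common heuristic.
        issue_purchase_order(item, 2 * reorder_point - stock[item])
```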

RPA can also support order processing by automating order entries, validation, and confirmation.

Manufacturing companies can automate production line monitoring, such as tracking machine performance and notifying of potential or ongoing problems, which is very helpful for increasing production efficiency.
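A rule-based monitoring bot of this kind can be sketched as follows; the metrics, limits, and notify stub are hypothetical.

```python
# Sketch of rule-based production-line monitoring: compare machine
# metrics against thresholds and notify on problems. Metrics, limits,
# and the notify stub are hypothetical.

LIMITS = {"temperature_c": 85.0, "vibration_mm_s": 7.1, "reject_rate": 0.02}

def notify(machine: str, metric: str, value: float) -> None:
    print(f"ALERT {machine}: {metric} = {value} exceeds limit")

def check_machine(machine: str, readings: dict[str, float]) -> None:
    for metric, limit in LIMITS.items():
        if readings.get(metric, 0.0) > limit:
            notify(machine, metric, readings[metric])

check_machine("press-03", {"temperature_c": 91.5, "vibration_mm_s": 4.0,
                           "reject_rate": 0.01})
```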

Quality control is another part of manufacturing companies’ workflows that can be automated. RPA can handle data collection and analysis, as well as report generation.

Read more about robotic process automation in the manufacturing industry at: https://xplusglobal.com/resources/blog/robotic-process-automation-rpa-in-manufacturing-industry/

Why Should a Manufacturing Company Implement RPA?

Manufacturing companies that want to stay ahead of their competitors can effectively optimize their production, reduce costs, and increase profits by using RPA.

Automating repetitive, rule-based tasks saves significant resources and frees employees to engage in work that develops their skills and expertise. It also reduces human error and increases accuracy. Whether the automated tasks are accounting processes or production quality control, they can have a significant impact on the company’s performance.

Automation also speeds up processes and allows companies to run them outside of employees’ working hours if needed. While this can further increase business productivity, speed is especially important for some processes – for example, tasks related to customer service and shipment.

Finally, the reduction in the need for manual labor makes companies more scalable, as they do not necessarily need to hire more employees to handle an increasing number of tasks that can be automated.

Potential Challenges of RPA Implementation

Despite the considerable benefits, companies may encounter certain issues when implementing RPA. The biggest problem is the upfront financial investment, which not all companies can afford. The cost depends on the RPA solution an organization chooses: for instance, the Microsoft Dynamics 365 ERP and CRM suite comes with built-in automation tools, while other systems may require separate automation software.

These expenses may include software costs as well as investments in employee training.

Although in many cases companies enjoy significant savings after deploying RPA software, smaller businesses may not see a return on investment high enough to justify the implementation.

Another potential risk associated with RPA concerns security. Automated execution of processes creates more opportunities to overlook vulnerabilities that can lead to data breaches or other losses. At the same time, depending on the exact nature of the information being processed, companies must ensure that their RPA systems comply with data security regulations. Meeting these regulations, in turn, requires additional resources and can further increase the cost of implementation.

RPA may also require changes in work routines. To successfully deploy automation, employees must adapt to the changes. While this requires the training mentioned above, there is still the possibility that some employees may not be ready to adapt to new workflows. This can slow down adoption of the system and disrupt some business operations.


The Most Common Problems Related to Automated Testing


More and more companies are deciding to move to automated testing because it is an effective way to reduce the cost generated by manual labor, speed up software adoption, and reduce the likelihood of human error. Although automated testing itself does not require as much effort as manual testing, there are certain challenges that many companies face, especially during the implementation process.

Selection of the Right Tool

In order to automate testing, companies need to decide which tool they want to use for this task. The range of options is quite large, which makes it challenging to make the right decision. In addition to the financial factor, which will be discussed in the next point, companies should also carefully consider the desired features.

To reduce human involvement to a minimum, no-code solutions can be an optimal choice, as they do not require advanced technical knowledge to create and maintain test cases. Implementing such tools is also easier and faster than building a custom testing tool from solutions distributed as libraries.

At the same time, custom testing solutions allow companies to automate different types of tests, while most low-code and no-code software products are suitable only for functional testing. However, there are exceptions such as Executive Automats, which is also a powerful tool for performance testing.

Read more on: https://www.executiveautomats.com/top-5-automated-testing-concerns-in-ms-dynamics-365/

High Upfront Investment Cost

Free test libraries such as Selenium may seem the most obvious choice for many enterprises, but such tools generate other expenses that should be considered. Despite the lack of licensing fees, they involve a high upfront investment in setting up and maintaining the test infrastructure, which cannot be done without professional developers.

Creating and maintaining tests for Selenium-based solutions also requires expert assistance. This is itself costly, and working with Selenium takes more time than working with low-code or no-code solutions, which increases expenses even more.
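For a sense of where that time goes, here is a minimal Selenium test of a hypothetical login page. Note how tightly it is coupled to the page URL and element IDs: any UI change breaks it, and someone has to maintain it.

```python
# A minimal Selenium test of a hypothetical login form. The URL and
# element IDs are made up; a real suite contains hundreds of such
# brittle, structure-dependent steps.

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()            # requires a local Chrome/driver setup
try:
    driver.get("https://example.com/login")          # hypothetical URL
    driver.find_element(By.ID, "username").send_keys("test.user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title               # hypothetical check
finally:
    driver.quit()
```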

While no-code solutions such as Executive Automats require an upfront investment, they can provide a higher ROI in the long run due to reduced requirements for supervision from professional testers and simplified implementation.

Companies should evaluate the scope of testing, budget, and level of technical preparation before making a final decision.

Unrealistic Expectations

While test automation offers significant benefits to organizations that rely on third-party software, such as ERP and CRM suites, it is not a magic solution that can meet all testing requirements of a company.

Some unrealistic expectations are related to the problems mentioned earlier in this article. One of these is the belief that automated testing provides complete test coverage. In many cases, much of the work can indeed be automated, but automation is not helpful for particularly complex test scenarios or for types of tests that require a human; user experience testing is one example.

There is also a common myth about automation freeing companies from the need to maintain tests. Test automation does not mean that tests do not need to be updated when the tested software changes. Enterprises that use ERP software such as Microsoft Dynamics 365 Finance and Supply Chain Management should consider this fact since this system undergoes frequent modifications.

In addition, many companies would like to switch to automated testing because they believe that it can detect any bug. Even though automation increases the level of accuracy, it still does not guarantee that no errors will occur. Some problems may remain undetected and require manual testing for further investigation.

Ineffective Strategy

Choosing an inadequate strategy for test automation is particularly common among organizations with unrealistic expectations. At the same time, choosing the right strategy is a challenging task for any business.

Again, it is necessary to set clear expectations for the test automation project in order to choose the right approach to automated testing. For instance, some companies looking for ways to automate testing of their ERP and CRM systems may be seeking cost reduction, while others are looking for a solution to minimize the testing time required to roll out updated software. Furthermore, there are organizations that want to automate testing to increase the quality of their software and reduce downtime or other costly consequences of compromised performance.



Benefits Of Using A Multi-Cloud Strategy


A multi-cloud strategy means a company intentionally uses different public cloud solutions to store its data. Amazon Web Services, Google Cloud, and Microsoft Azure are some of the most popular public cloud vendors that companies can choose from when adopting this strategy. Companies do this because of the various benefits they stand to enjoy, including the following:

1.     Security

This strategy improves the security of stored data. Companies can transfer some workloads to public Infrastructure-as-a-Service (IaaS) providers with security benefits. They can also transfer files between cloud platforms while waiting for services to resume after a phishing or brute-force DDoS attack by hackers.

The vendors also provide data backup and recovery services in case of data loss due to natural disasters, power outages, or malfunctioning disks. Additionally, the strategy helps to reduce factors that affect performance, such as packet loss, jitter, and latency, which usually occur when traffic moves from one server or network to another. This way, the multi-cloud approach makes your business more resilient.

Working with cloud services – illustrative photo. Image credit: Sigmund via Unsplash, free license

2.     Boosting Performance

When choosing a cloud services provider, a company considers whether the IaaS provider can meet its needs for performance, affordability, and location. With this strategy, companies can build a fast infrastructure that maximizes application performance and lowers the cost of merging cloud services with their current IT network. By spreading workloads across various cloud providers, companies can create networks that improve user experience and response times.

3.     Compliance

Many jurisdictions have strict governance and data privacy rules, such as the GDPR and CCPA, which can mandate that clients’ data be kept in specific locations. With this strategy, companies can meet such requirements without the hassle of creating and operating their own in-house data centers.

4.     Flexible and Scalable

As a business grows, its amount of data also increases significantly. Organizations that want to store and analyze their data can therefore use several cloud providers, increasing or reducing their storage capacity as the need arises.

5.     Customized Strategy

A company that uses multiple cloud services can choose the provider that best suits its needs. The advantage of doing this is that an organization is not forced to adjust its functions to comply with a provider’s specifications. The business has the freedom to use different providers that are most compatible with every aspect of the company and its needs.

6.     Eliminate The Risk Of Lock-Ins With A Single Vendor

By relying on a single vendor, you become tied to them, and it will be hard to change your applications in the future when you want to alter them. Even if one vendor is ideal for you at a specific time, it might not be suitable once you need to scale your storage.

Additionally, you could lose the opportunity to grab some fantastic deals that arise in the future. When you choose a multi-cloud approach from the start, your developers can build apps that operate on different platforms, as sketched below. You therefore remain flexible, able to capitalize on the best capabilities and processes from various providers, and still able to deliver the quality you promised your customers.
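One common way developers achieve that portability is to hide each provider behind a shared interface, as in this sketch. The boto3 and google-cloud-storage upload calls are real, but the setup is simplified, error handling is omitted, and the bucket names would be your own.

```python
# Sketch of keeping an app portable across clouds: hide each provider
# behind a common interface so application code never names a vendor.

from typing import Protocol

class BlobStore(Protocol):
    def upload(self, local_path: str, key: str) -> None: ...

class S3Store:
    def __init__(self, bucket: str):
        import boto3                      # pip install boto3
        self.client, self.bucket = boto3.client("s3"), bucket
    def upload(self, local_path: str, key: str) -> None:
        self.client.upload_file(local_path, self.bucket, key)

class GCSStore:
    def __init__(self, bucket: str):
        from google.cloud import storage  # pip install google-cloud-storage
        self.bucket = storage.Client().bucket(bucket)
    def upload(self, local_path: str, key: str) -> None:
        self.bucket.blob(key).upload_from_filename(local_path)

def backup(store: BlobStore, path: str) -> None:
    store.upload(path, f"backups/{path}")  # same code, any provider
```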

Final Thoughts

Data is crucial for any business in this day and age. Organizations do their best to protect the data and ensure their IT systems run smoothly. Adopting the multi-cloud strategy is an excellent way to achieve these vital organizational goals.

