

Protecting pharmaceutical data with Azure

Xcelpros Team

At a Glance

  • $985 million: The median cost of getting a new drug into the market
  • $1.3 billion: The newer, lower average cost of getting a new drug to market
  • $2.8 billion: The previous average cost of getting a new drug to market
  • $200 billion: The estimated size of the counterfeit drug market
  • 13: The number of new drugs not brought to market each year because of revenue losses from counterfeit drugs

Sources: Wikipedia and Statista.com

Introduction

On average, the cost of bringing a new medicine from idea to market – aka the drug development process – has dropped significantly, from $2.8 billion per drug to $1.3 billion, according to figures compiled on Wikipedia. Counterfeits still have a measurable effect on the number of drugs being brought to market: figures from Wikipedia and Statista.com show that prescription drug makers continue to get hammered by counterfeit competition.

Statista’s 2022 study provided interesting data on different scenarios, showing how the losses change with the size of the counterfeit market: the number of new medicines not brought to market ranged from six at $100 billion to 28 at $431 billion.

So, what does all this mean? The short version is big pharma and even smaller companies have a considerable investment in intellectual property (IP) they must protect.

IP and Drug Manufacturers

“IP rights, if sufficiently limited, are typically justified as necessary to allow pharmaceutical manufacturers the ability to recoup substantial costs in research and development, including clinical trials and other tests necessary to obtain regulatory approval from the Food and Drug Administration (FDA),” the Congressional Research Service (CRS) states.

Pharmaceutical companies are protected by two types of intellectual property (IP): patents, which give exclusive rights to the holder for 20 years, and regulatory exclusivities. According to CRS, these exclusivities range from six months to 12 years, depending on the specific type of drug or biologic.

These companies have a substantial financial investment in their research, development, and testing data. The only way to recoup that investment is by making and selling products.

Developing a single drug can cost millions of dollars and take over ten years of dedicated work. With that much money involved, thieves have a strong incentive to capture this research for themselves.

Protecting Your IP Investment

Take a look at another statistic: $590 million. That’s the amount the U.S. Treasury Department estimates was paid by victims of 450 ransomware attacks in the first half of 2021 alone.

Based on that short time frame alone, the Treasury’s Financial Crimes Enforcement Network (FinCEN) stated that “ransomware is an increasing threat to the U.S. financial sector, businesses, and the public.”

Ransomware is one of many cyber-attacks that share a common goal: stealing money. This attack works by infiltrating a company’s computer network and taking control of it. Companies then face a difficult decision: pay the ransom, or risk having their data destroyed – or worse, shared worldwide.

Distributed denial of service (DDoS) attacks are another standard weapon in a hacker’s arsenal. “In computing, a denial-of-service attack is a cyber-attack in which the perpetrator seeks to make a machine or network resource unavailable to its intended users by temporarily or indefinitely disrupting services of a host connected to a network,” a Wikipedia article states.

Many of these attacks start with simple phishing schemes. If attackers get just one employee out of thousands to open an infected email, they have a doorway into your data. Microsoft does everything it can to block these attacks and protect its investment and yours.

As with ransomware and other attacks through stealth or brute force, the goal of DDoS attacks is money. The thinking is, “Hit a company hard enough, often enough, and it will pay you to leave it alone.”

Figure 1: Key strategies to avoid cybersecurity attacks

Book a consultation to discover more about Azure protection in the pharma industry.

Protection in the Cloud

Every company has another option, though: investing in its security.

One method of stealing data involves capturing it as it moves from place to place. Microsoft has invested millions of dollars to continuously protect its customers’ data through its Azure cloud computing platform. The company stated in a recent blog post that it detects 1.5 million attempts per day to compromise its systems and spends about $1 billion a year on computer infrastructure security.
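As a concrete illustration of protecting data in transit, the sketch below uploads a research file to Azure Blob Storage over an encrypted HTTPS connection using Microsoft's Python SDK. The storage account, container and file names are placeholders invented for this example, not anything prescribed by Microsoft or this article.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder storage account URL -- substitute your own account.
ACCOUNT_URL = "https://pharmaresearchdata.blob.core.windows.net"

# DefaultAzureCredential resolves a managed identity or environment credentials,
# so no passwords or keys are embedded in the code.
service = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
blob = service.get_blob_client(container="trial-results", blob="batch-42.csv")

# The SDK talks to the storage endpoint over HTTPS (TLS), so the data is
# encrypted while it moves; Azure Storage then encrypts it at rest on arrival.
with open("batch-42.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)
```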

With the ongoing need to invest in protecting their on-premise equipment from attacks, more firms are migrating to the cloud, with platforms like Azure gaining importance with each thwarted attack. Moving data from on-premise network servers to widely spread cloud data centers means attackers have to hit moving targets if they want to control a company’s data.

“The cloud has some built-in advantages. Unlike the internet, it was built from the ground up with modern security and privacy in mind. It’s also a controlled ecosystem protected by people who spend all day thinking about data security and privacy,” according to a recent Microsoft online article.

Traditionally, internet and computer security safeguards were bolted onto a tool rather than built into it. “With cloud infrastructure, security considerations are part of the development process,” Microsoft states. “The cloud is an opportunity to do security better,” security analyst Doug Cahill added.

Azure’s status as a cloud platform means that all of the money, and the 3,500 security engineers Microsoft devotes to making it secure, also benefit the software running on top of it. For example, Microsoft Dynamics 365’s Supply Chain Management, which helps companies track raw materials and finished products from the warehouse to the customer, runs on top of Azure.

Companies using this software get the bonus of automatic protection for their cloud data. While these updates can’t directly prevent thieves from making a physical attack on a company’s buildings, they can make it harder for them to steal data on the move.

“If we detect a set of attacks on one tenant or a handful of tenants, we can synthesize that and start using the things we learn to protect all the other tenants out there,” said Bharat Shah, Microsoft’s vice president of security for the Azure platform. “That’s the cloud effect. We learn. We react. We turn something on, and we protect everybody else.”

Azure benefits from Microsoft’s investment in machine learning – a branch of artificial intelligence – to track attempted attacks. Microsoft takes what it learns and uses it to benefit not just Azure but all the companies whose multi-million-dollar intellectual property investments ride on top of it.

The Bottom Line

Citing the NETSCOUT Threat Intelligence report, Forbes.com estimated 26,000 cyber-attacks per day, or 18 per minute, in 2020 alone. The report indicated that security threats against industrial control systems and operational technology tripled in 2020, and projected that DDoS attacks would grow to 15.4 million by 2023.

These numbers should make any executive who doesn’t have a significant cyber security team on staff nervous. Thankfully, companies that use Microsoft Azure’s cloud computing platform have more than 3,500 security engineers devoted to protecting the platform and the data running through it.

With the livelihood of pharmaceutical companies depending on keeping their data safe, secure, and private, you can’t afford to risk your company’s data with poor security. Investing in Azure services today can make a huge difference in your bottom line.


ERPs Make Big Data and Big Business a Good Match

Xcelpros Team

Introduction

What does “Big Data” mean to you and your company? To many, the phrase means large quantities of information from different sources, data that changes by the second. For example, it can refer to the temperature of a chemical process where a small variation makes the difference between good product and wasted materials.

A common big data definition is, “a collection of data that is huge in volume yet growing exponentially with time,” guru99.com states. The “4 Vs of Big Data” are:

Figure 1: The 4 Vs of Big Data

  • Volume, in terms of data coming from multiple sources at the same time
  • Variety, which can be flow volumes, temperatures, production costs and other information calculated separately
  • Velocity, referring to the speed of information from application logs and device sensors (IoT)
  • Variability, meaning data flows while a machine is running and stops when the production cycle ends

“Big Data” can also refer to lines from sales contracts referencing products, volumes and/or quantities from several customers. From a supply chain perspective, those same sales numbers require raw materials plus labor and machine operating time to produce them.

In the past, “Big Data” often referred to information from one department such as Production or Sales. One of the biggest challenges with big data is providing information siloed in one department to other areas that need it.

Even searching big data has its challenges, since the results depend on the query. When a query isn’t phrased correctly, or a required document has a naming error, important information is left out.

A key challenge is overwhelming volume.

  • The New York Stock Exchange generates one terabyte of data each day
  • Facebook cranks out more than 500 terabytes of customer-uploaded photos and videos every day
  • A jet engine generates more than 10 terabytes of data in 30 minutes of flight

By the Numbers

Many businesses are drowning in data, not all of which is useful.

  • 8%: The share of businesses using more than 25% of the internet of things (IoT) data available to them
  • 10% – 25%: The share of marketing databases containing critical errors
  • 20% – 30%: The share of operational expenses directly tied to poor data
  • 40%: The growth rate of corporate data, with a SiriusDecisions study finding that organizational data typically doubles every 12-18 months
  • 40%: The share of businesses missing business objectives because of poor data quality
  • $13.3 million: The average annual cost of poor data quality

Big Data Costs

Big Data comes with costs, especially for in-house networks. Once data is obtained, it must be stored before being analyzed. Data is also usually backed up in case something damages, destroys or – in the case of hacking – hijacks it.

The actual costs of this data vary based on business size and need. Estimates place the lowest range at $100 – $200 per month to rent a small business server. Installation costs typically start at $3,000 and go up from there.

Big Data includes up-front as well as hidden costs. The up-front costs most people consider include:

  • Software tools to manage and analyze data
  • Servers and storage drives to hold the data
  • Staff time to ensure the physical devices work properly and to manage the data

These costs scale proportionally depending on the business’ storage and retrieval requirements and the processing power required to gather the data.

Hidden costs usually refer to the bandwidth needed to move data from one source or site to another. While we might consider it a simple task to download a movie on a cellphone, moving terabytes of data between servers can be significantly more expensive.

Accurately estimating big data costs is basically impossible without a detailed look at each company’s specific requirements and needs. However, online research estimates them to be anywhere from several hundred dollars per month for a small business to tens of thousands of dollars per month or more. Infrastructure costs alone can easily top $1,000 – $2,000 per terabyte (TB), with qualified outsourced consultants averaging $81 – $100 per hour.
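To make those figures concrete, here is a back-of-the-envelope estimate in Python using the ranges above. The storage volume, consultant hours and the assumption that the per-terabyte figure recurs monthly are illustrative guesses, not benchmarks.

```python
# Rough monthly big data cost estimate from the ranges quoted above.
storage_tb = 20                      # assumed terabytes under management
infra_cost_per_tb = (1_000, 2_000)   # $ per TB (low and high ends), treated as monthly
consultant_rate = (81, 100)          # $ per hour for outsourced expertise
consultant_hours = 40                # assumed part-time support per month

low = storage_tb * infra_cost_per_tb[0] + consultant_rate[0] * consultant_hours
high = storage_tb * infra_cost_per_tb[1] + consultant_rate[1] * consultant_hours

print(f"Estimated monthly cost: ${low:,} - ${high:,}")
# Estimated monthly cost: $23,240 - $44,000
```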

Big Data Limitations

Having access to large volumes of data is great – when a company knows what to do with it. Especially when servers are in-house, big data has its limitations. These problems include:

  • Software tool compatibility, such as different types and brands of databases
  • Correlation errors, such as linking incompatible or unrelatable variables
  • Security and privacy from the standpoint of only exposing your data to the eyes of qualified people

From a mechanical perspective, one industrial device might use a Siemens programmable logic controller (PLC). Another device can use a Rockwell PLC and a third could be from Mitsubishi Electric. These different devices add additional layers of complexity.

Using supervisory control and data acquisition (SCADA) architecture is one way some larger companies are resolving PLC compatibility issues. SCADA includes computers, networked data communications and graphical user interfaces.
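The article does not prescribe a particular protocol, but SCADA layers typically hide vendor differences behind a standard interface such as OPC UA. As an illustration only, the sketch below uses the open-source asyncua Python library to read one tag; the endpoint URL and node ID are placeholders, and the same code would work whether the underlying PLC is Siemens, Rockwell or Mitsubishi.

```python
import asyncio
from asyncua import Client  # open-source OPC UA client

# Placeholder endpoint and node ID -- both depend entirely on the plant's setup.
ENDPOINT = "opc.tcp://192.168.1.50:4840"
TEMPERATURE_NODE = "ns=2;s=Reactor1.Temperature"

async def read_temperature() -> float:
    # The same client code works against any OPC UA-capable PLC or gateway,
    # which is how a SCADA layer masks vendor-specific differences.
    async with Client(url=ENDPOINT) as client:
        node = client.get_node(TEMPERATURE_NODE)
        return await node.read_value()

if __name__ == "__main__":
    print(asyncio.run(read_temperature()))
```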

Figure 2: Big Data Limitations

Resolving Big Data Issues

One way pharmaceutical companies can resolve rising big data issues, especially those caused by older legacy systems, is with a modern ERP. Enterprise resource planning software such as Microsoft Dynamics 365 (D365) resolves many of these incompatibility issues.

Data integration is a major big data problem for companies that use one database in production and another in finance.

D365’s data integrator is a point-to-point integration service that supports moving data between Finance and Operations apps and Microsoft Dataverse. The software lets administrators securely create data flows from sources to destinations, and data can be transformed before being imported.

Dual-write, a related D365 function, provides bidirectional data flow for documents, master data and reference data.
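Once data lands in Dataverse, it can also be read through the standard Dataverse Web API (an OData endpoint). The snippet below is a minimal sketch of querying a few rows over HTTPS; the organization URL, table and token handling are placeholders, not values from this article.

```python
import requests

# Placeholders: your environment URL and an OAuth bearer token issued by Azure AD.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<bearer token from Azure AD>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Read a handful of account names from the Dataverse Web API.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/accounts?$select=name&$top=5",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()

for row in resp.json()["value"]:
    print(row["name"])
```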

This type of data collection raises potential ethical issues when accessing large quantities of personal information, which could include contact information for patients enrolled in a new drug study.

Installations by professionals experienced in working with pharmaceutical companies can organize data and help strip out personal information. Removing it reduces the chance of a HIPAA (Health Insurance Portability and Accountability Act) violation.
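As a simple illustration of that de-identification step (the file and column names are hypothetical), direct identifiers can be dropped before study data ever reaches the ERP or analytics environment:

```python
import pandas as pd

# Hypothetical enrollment extract; the column names are assumptions for illustration.
df = pd.read_csv("study_enrollment.csv")

# Direct identifiers that should not travel with the analytical data set.
pii_columns = ["patient_name", "email", "phone", "street_address"]

# errors="ignore" lets the script run even if a column is absent from a given extract.
deidentified = df.drop(columns=pii_columns, errors="ignore")
deidentified.to_csv("study_enrollment_deidentified.csv", index=False)
```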

Being a cloud-based product, D365 also cuts down many of the personnel costs associated with big data management and maintenance. Microsoft assumes those costs along with the burden of data security.

Conclusion

Having a lot of information lets companies make accurate, informed decisions. Problems crop up when data is kept in departmental silos. Using an ERP to integrate information across departments removes many barriers to sharing information, which leads to more accurate sales and inventory predictions, reducing overall costs and boosting profits.

Boost your business ROI with Microsoft Dynamics 365 and manage big data efficiently. Book a consultation to learn more.
