
Hardware Encryption Market 2022-2027: Growing Rapidly with Latest Trends and Future scope with Top Key Players- Kanguru Solutions, Toshiba, Western…

Hardware Encryption Market Study 2022-2027:

The newly published Hardware Encryption Market report covers market overview, future economic impact, competition by manufacturers, supply (production) and consumption analysis, and focuses on various products and other market trends.

The market research report on the global Hardware Encryption industry provides a comprehensive study of the various techniques and materials used in the production of Hardware Encryption market products. From industry chain analysis to cost structure analysis, the report examines multiple aspects, including the production and end-use segments of Hardware Encryption market products. The latest industry trends are detailed in the report to measure their impact on the production of these products.

Get sample of this report @ https://www.marketresearchupdate.com/sample/373571

Leading key players in the Hardware Encryption market are Kanguru Solutions, Toshiba, Western Digital, Netapp, Maxim Integrated Products, Kingston Technology, Gemalto, Seagate Technology, Samsung Electronics, Winmagic, Micron Technology, Thales

Results of recent scientific undertakings towards the development of new Hardware Encryption products have been studied. The factors leading industry players to adopt synthetic sourcing of the market products have also been studied in this statistical surveying report. The conclusions provided in this report are of great value for the leading industry players. Every organization partaking in the global production of Hardware Encryption market products has been mentioned in this report, in order to provide insights into cost-effective manufacturing methods, the competitive landscape, and new avenues for applications.

Product Types: AES, RSA

On the Basis of Application: Consumer Electronics, IT, Transport, Aerospace, Medical, Financial Services, Other

Get Discount on Hardware Encryption report @ https://www.marketresearchupdate.com/discount/373571

This report also covers expansions, mergers and acquisitions, as well as price, revenue, and production. It additionally provides revenue, CAGR, and production shares by manufacturer.

1) The varying scenarios of the overall market have been depicted in this report, providing a roadmap of how the Hardware Encryption products secured their place in this rapidly changing marketplace. Industry participants can reform their strategies and approaches by examining the market size forecast mentioned in this report. Profitable marketplaces for the Hardware Encryption Market have been revealed, which can affect the global expansion strategies of the leading organizations. In addition, each manufacturer has been profiled in detail in this research report.

2) The Hardware Encryption Market Effect Factors Analysis chapter emphasizes Technology Progress/Risk, Substitutes Threat, Consumer Needs/Customer Preference Changes, Technology Progress in Related Industry, and Economic/Political Environmental Changes that drive the growth factors of the market.

3) The fastest- and slowest-growing market segments are pointed out in the study to give significant insights into each core element of the market. New market players are commencing their trade and accelerating their transition into the Hardware Encryption Market. Merger and acquisition activity is forecast to change the market landscape of this industry.

This report comes with an added Excel datasheet covering the quantitative data from all numeric forecasts presented in the report.

Regional Analysis For Hardware Encryption Market

North America (the United States, Canada, and Mexico)
Europe (Germany, France, UK, Russia, and Italy)
Asia-Pacific (China, Japan, Korea, India, and Southeast Asia)
South America (Brazil, Argentina, Colombia, etc.)
The Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria, and South Africa)

View Full Report @ https://www.marketresearchupdate.com/industry-growth/hardware-encryption-market-scope-and-overview-2022-2027-373571

What's in the offering: The report provides in-depth knowledge about the utilization and adoption of Hardware Encryption industries in various applications, types, and regions/countries. Furthermore, key stakeholders can ascertain the major trends, investments, drivers, vertical players' initiatives, government pursuits towards product acceptance in the upcoming years, and insights into commercial products present in the market.

Lastly, the Hardware Encryption Market study provides essential information about the major challenges that are going to influence market growth. The report additionally provides overall details about the business opportunities available to key stakeholders to expand their business and capture revenue in specific verticals. The report will help existing or upcoming companies in this market to examine the various aspects of this domain before investing or expanding their business in the Hardware Encryption market.

Our Recently Published Article:

https://tealfeed.com/usa-meetings-events-market-development-strategy-3yifu

https://tealfeed.com/europe-explosive-charging-equipment-market-price-a3q2w

https://tealfeed.com/usa-m-phenylene-diamine-mpd-cas-pg6iu

https://tealfeed.com/usa-simultaneous-localization-mapping-technology-market-j5a11

https://tealfeed.com/usa-metal-gear-component-industry-future-2by4n

https://tealfeed.com/europe-liquid-ammonium-liquid-potassium-thiosulfate-cf9jp

This Press Release has been written with the intention of providing accurate market information which will enable our readers to make informed strategic investment decisions. If you notice any problem with this content, please feel free to reach us on [emailprotected]

See the article here:
Hardware Encryption Market 2022-2027: Growing Rapidly with Latest Trends and Future scope with Top Key Players- Kanguru Solutions, Toshiba, Western...

Read More..

Secrets at the Command Line [cheat sheet included] – Security Boulevard

If developers all share one thing in common, it is their use of the command line to get their jobs done. Many development tools don't come with a graphical user interface (GUI) and rely on a command line interface (CLI). There are a lot of upsides to a CLI-first or CLI-only approach. Once you master the command line, you can work more efficiently than a GUI might allow and gain the awesome superpower of scripting, allowing all of your tools to start working in concert. Scripting is the bedrock for building and managing software delivery pipelines and CI/CD workflows.

However, there is nothing more intimidating, especially to a newer dev, than a blank terminal window with a blinking cursor awaiting your commands. There is no helpful UI to guide you towards your goal; you just have to know what to enter. All the burden of getting it just right falls on the developer's shoulders, and there is a lot to learn, especially when you factor in security.

One area that often gets neglected while mastering the command line is local security around credentials, or what we like to call secrets. While it might feel like secrets management is an area reserved for code repositories, runtime environments, and the CI/CD pipelines that drive modern application delivery, good security practices start at home. Home for devs means the terminal.

The first step toward securing secrets in the command line is taking inventory of what secrets might exist. Secrets are any sensitive data that we want to keep private and that would grant access to systems or data, what you will see referred to as digital authentication credentials. Secrets fall into a few broad categories: passwords, keys, and certificates.

Since the beginning of computing, passwords have played a large part in security, and are synonymous with how a user can log into a system. The term user name is almost always paired with the word password when either is used. Passwords are something you know to prove who you are.

Keeping your passwords secure is as important as having them in the first place. No one would think it is a good idea to write passwords on a post-it note and stick it on their monitor, but a lot of developers are guilty of storing their passwords in plaintext in local files. If someone gets access to your local machine, a quick search of the contents can reveal any such document and let them start pretending to be you.

In an ideal world, you would simply keep all your passwords securely inside your head. But in reality, we all have way too many passwords for that to be a viable strategy. Fortunately, there are many approaches for securing passwords locally, and we will dig into those later in this article; for now, let's keep going with making our inventory of local secrets.

Keys serve the same basic function as passwords, granting access to systems and data, but differ in several key ways. Passwords are generally generated by humans, for human access to a system or data. They are generally shorter, and ideally, you can memorize them.

Keys are typically generated by an algorithm and are generally much longer and more complex than passwords. Overall, they are not meant to be manually entered, nor are they intended for human access to a system; keys are meant to grant machines and processes access to another system or unlock encrypted data. Another way to say that would be keys are used to lock and unlock cryptographic functions such as encryption or authentication.

One type of key you are likely very familiar with as a developer is your SSH key. This is a prime example of a paired public/private key system at work. SSH public keys are intended to be shared with any remote system or application your local machine will need to access, such as GitHub or AWS. A corresponding private key is stored on your local file system and should never, ever be shared with anyone. When used together, these keys ensure that access is granted only to trusted systems. This is a very secure approach.
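
As a rough illustration of that pairing, here is a minimal sketch of generating and handling an SSH key pair from the command line; the file name, comment, and remote host below are placeholders, and your own setup may differ:

```bash
# Generate a new ed25519 key pair; you will be prompted for an optional passphrase.
# The private key stays on your machine; the .pub half is what you hand out.
ssh-keygen -t ed25519 -C "work laptop" -f ~/.ssh/id_ed25519_demo

# The private key should be readable only by you.
chmod 600 ~/.ssh/id_ed25519_demo

# Install only the public half on a remote host you control (placeholder address).
ssh-copy-id -i ~/.ssh/id_ed25519_demo.pub user@example.com
```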

Just like with passwords, the security benefits of keys come with the overhead of needing to secure them locally. While it is unlikely you would hand type a plaintext key into a system, or write it down on paper, as we will see in the next section, there are multiple ways keys can become exposed if we are not careful.

Certificates are a way to store, transport, and use keys. Certificates contain keys, as well as important metadata, such as the issuer, what the certificate is for, and a signature intended to verify the authenticity of the certificate.

While SSL or TLS certificates might spring to mind as a primary use case for security certificates, an increasing number of applications and platforms leverage certificate-based authentication. Identity management services, like Active Directory, offer integrations that make it easier to leverage certificates to better control access rights for users, while freeing the users from needing to manually manage passwords.

Where and how you store certificates locally might not be obvious, especially on machines provisioned by central IT departments. It is still the developer's responsibility to be aware of what certificates are on their local machine and to ensure the right safeguards are in place to prevent them from being inappropriately shared. It is also important not to expose the keys these certificates hold, as that is a potential threat as well.

Now that we know what kinds of secrets we are looking for, the next logical question is how they might be exposed to a bad actor or malicious code. Your laptop can be compromised, and any plaintext files with passwords or stored keys can be stolen. But there is also the possibility that any server you connect to via SSH might be unexpectedly accessed. The SSH credentials themselves might mean someone can cause problems while pretending to be you!

Keeping command line secrets safe matters in any situation where you are using a shell. Let's take a look at some of the ways you might be exposing secrets.

Credential files are a way to store secrets safely away from any directories that get version controlled. You can set permissions for these files easily with chmod, and their contents can be accessed programmatically. You can even manage a separate file for each credential, making it harder for any intruder to gather them and limiting the scope of their attacks. These do carry the risk of having credentials in plaintext, but we can address that, and any file, with good encryption.
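
As a minimal sketch of that idea, the commands below create a locked-down credential file and read it back in a script; the directory, file name, and variable names are placeholders rather than a prescribed layout:

```bash
# Keep credential files in a directory that is never version controlled.
mkdir -p ~/.credentials
chmod 700 ~/.credentials

# Capture the secret without echoing it or leaving it in shell history,
# then restrict the file to your user only.
read -rs -p "Enter DB password: " db_pw
printf '%s' "$db_pw" > ~/.credentials/db_password
unset db_pw
chmod 600 ~/.credentials/db_password

# Scripts can load the value without it ever being typed at a prompt again.
DB_PASSWORD="$(< ~/.credentials/db_password)"
```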

Entering passwords or keys into a CLI prompt is necessary from time to time. The danger here comes from the fact that anything entered into the terminal in plaintext is stored in plaintext in your terminal's history. All shells store your history, but to help put this in perspective, if you are running Bash or Zsh, your entire shell history is stored in a file called either .bash_history or .zsh_history. If you go examine that file, you might be surprised at the number of lines it contains. Anyone who gains access to your machine or a shared terminal environment would quickly be able to find any credentials entered directly into a shell by you as a user.
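
A quick, hedged sketch of checking your own exposure follows; the grep pattern is only a rough heuristic and the variable name is a placeholder:

```bash
# How much has already been recorded? (Zsh users: swap in ~/.zsh_history)
wc -l ~/.bash_history

# Rough scan for things that look like credentials.
grep -inE 'password|passwd|token|api[_-]?key|secret' ~/.bash_history

# In Bash, ignorespace keeps any command that starts with a space out of history.
export HISTCONTROL=ignorespace
 export API_TOKEN="placeholder-value"   # note the leading space
history | tail -n 2                     # the export above is not recorded
```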

Fortunately, most applications have ways to safely pass credentials without entering them as plaintext. If you encounter an application that needs plaintext credentials, you should consider one of the approaches we outline in the next section. Most of the time, you can still work safely and never expose a credential. If you work with a tool that in no way allows you to pass credentials in a safe way though, it might be time to have a conversation with your security team about the tool in question.

Just like with your Bash history, logs can expose any and all secrets that are stored in plaintext or loaded in an insecure way. Arguably, log files are less secure than your Bash history, given that logs under /var/log are often readable by other users and might be accessed by unexpected actors.

Piping credentials between locations is overall a very secure way to handle secrets, assuming you are not calling the special /dev/stdin file along the way. Shells like Bash that use stdin (standard input) automatically expose any input through a file that is accessible by other processes on your machine.

If you go to the terminal right now and just try to print that file with cat, a curious thing happens. Anything that you type is immediately printed to the screen upon hitting enter. Why? Your shell is concatenating (cat) the contents of the file at /dev/stdin to the standard output (stdout) of the terminal. If there is malware or spyware on your computer, or if someone has injected code into your scripts, it is possible that they can intercept the plaintext contents of this file even if you securely loaded a password or key directly into stdin from a secure source.
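
If you want to see that behavior for yourself, here is a small, harmless sketch; the piped string is obviously a stand-in for a real secret:

```bash
# /dev/stdin is simply a handle onto whatever is being fed to the process.
cat /dev/stdin
# ...type anything and press Enter: it is echoed straight back; Ctrl-D to exit.

# A pipe writes into that same stream, so a piped secret passes through it too.
printf '%s' 'stand-in-secret' | cat /dev/stdin
```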

Process Status (ps) is a standard utility on UNIX-like systems. It provides information about processes running in memory and is very important for understanding what your system is doing. On UNIX-like OSes, any value passed on a command line, including the contents of private key files, can be seen via ps while those commands are running, and it is also stored locally in the file /proc/<pid>/cmdline, which is globally readable for any process ID (pid). This can become an especially dangerous situation on machines with shared access, such as remote VMs or servers.
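
Here is a harmless way to see that exposure on a Linux box; python3 simply stands in for a real tool that takes a secret as an argument, and the flag and value are placeholders:

```bash
# Stand-in for a tool invoked with a secret flag; python3 just idles for a minute
# so we can inspect the process while it runs.
python3 -c 'import time; time.sleep(60)' --api-key 'not-a-real-secret' &
pid=$!

# Any local user can read the full command line of any running process.
ps -p "$pid" -o args=
tr '\0' ' ' < "/proc/$pid/cmdline"; echo

kill "$pid"
```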

While local credentials management might feel daunting, there are a number of approaches and tools that can help you work more safely and with more confidence in your day-to-day duties. While we are going to cover some of them here, we recognize there are likely more tools and tips that can address this issue; we invite your thoughts on this by letting us know on social media or our contact form.

You might already be familiar with password managers through your internet browsers. Just as providers like LastPass, 1Password, or Dashlane have made logging into web interfaces a lot simpler, there are plenty of tools out there that can help us store and manage our passwords safely for use on the command line.

One of the best examples of such a solution is HashiCorp Vault. They have great documentation on how to leverage Vault to programmatically call any needed credential without exposing it in plaintext. If a bad actor gets their hands on your code, they will see calls to Vault rather than the keys themselves, making it much harder for them to cause you any harm.
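
As a rough sketch of what programmatic retrieval can look like with the Vault CLI, assuming a KV secrets engine is in use; the server address, auth method, and secret path are placeholders for whatever your organization actually runs:

```bash
# Point the CLI at your Vault server and authenticate (method varies by setup).
export VAULT_ADDR='https://vault.example.internal:8200'
vault login -method=oidc

# Pull just the field a script needs at run time; nothing is hard-coded on disk.
DB_PASSWORD="$(vault kv get -field=password secret/myapp/db)"
```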

There are plenty of alternatives to HashiCorp Vault as well, such as KeePass, Azure Key Vault, Keeper Password Manager, and Akeyless Vault Platform, just to name a few. They all have their own idiosyncrasies, but as long as they keep your passwords secure and out of plaintext, we encourage you to adopt one as soon as you can. Your IT and security teams likely already have some approved password managers you can start using right away.

While we often think of encryption for data in transit, it is equally, and at times even more, important to address encryption of data at rest. Any sensitive data that we do not want to expose should be encrypted when we are not actively working with it. Admittedly, this does carry the overhead of having yet another encryption key to manage. However, combined with a good password manager, you will be surprised how easy it can be to keep credentials unusable by bad actors.

While there are a lot of possible ways to employ encryption, here are three types of tools we think are useful.

If a bad actor accesses your computer, you will want to make it as hard as possible for them to actually do anything with your data. That is the core idea behind local filesystem encryption; when you are logged out, the data on your machine gets encrypted and becomes unusable to anyone else.

For Linux systems, there are a lot of options built on LUKS, the Linux Unified Key Setup. In fact, when installing most distributions of Linux, you will be prompted to enable this by default.
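
For a secondary drive or spare partition, a hedged sketch of the basic LUKS workflow with cryptsetup looks like this; /dev/sdX1 and the mapper name are placeholders, and luksFormat wipes whatever is on that device:

```bash
# One-time setup: encrypt the partition (this DESTROYS existing data on it).
sudo cryptsetup luksFormat /dev/sdX1

# Unlock it under a mapper name, create a filesystem (first time only), and mount it.
sudo cryptsetup open /dev/sdX1 secure_store
sudo mkfs.ext4 /dev/mapper/secure_store
sudo mount /dev/mapper/secure_store /mnt

# When finished, unmount and close so the data is unreadable again.
sudo umount /mnt
sudo cryptsetup close secure_store
```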

Windows users can leverage tools like Microsoft's BitLocker, Folder Lock, or free open source tools like VeraCrypt.

If you are using a newer MacBook with the T2 security chip integrated, good news: you already have a very sophisticated encryption tool ready to use, FileVault.

The name SOPS derives from the term Secrets OPerationS. Unlike local file encryption schemes, SOPS is an encrypted file editor created by the team at Mozilla. The idea is to remove the manual steps of decrypting highly sensitive files, editing them, and then re-encrypting them. Instead, SOPS offers an editing experience that keeps encryption in place throughout the editing process. When opening SOPS-encoded files with other text editing tools, the structure of the file is preserved, but any sensitive data is protected.

SOPS is highly customizable and allows you to choose from multiple encryption mechanisms like GPG or HashiCorp Vault, making it easy to fit your workflow. It is a free and open source tool. There is even a VS Code extension available.
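
A minimal sketch of the SOPS workflow using a GPG key follows; the fingerprint and file names are placeholders, and your team may wire SOPS to Vault, a cloud KMS, or another backend instead:

```bash
# Encrypt a YAML file: values are encrypted, the keys and structure stay readable.
sops --encrypt --pgp 85D77543B3D624B63CEA9E6DBC17301B491B3F21 secrets.yaml > secrets.enc.yaml

# Edit in place: SOPS decrypts into your $EDITOR and re-encrypts when you save.
sops secrets.enc.yaml

# Decrypt to stdout only when a script genuinely needs the plaintext.
sops --decrypt secrets.enc.yaml
```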

Shellclear is a cross-platform shell plugin that promises a simple and fast way to secure your shell command history. It works by scanning your history for entries that match patterns of sensitive data so they can be flagged and cleaned out.

It is free and open source and very customizable. While this is a newer project, we think it is an elegant solution to finding what secrets are in your Bash history and helping you clean them out.

These are just a few options for securing data at rest on your machine. There are a lot of other tools and methods available. When evaluating any encryption tool, make sure they use a proven, known encryption algorithm. This is definitely one area where you do not want to write your own encryption scheme. Talk to your security folks about other options and tools that they might already have approved.

In simplest terms, environment variables are the settings the shell and other programs use to control behaviors such as time formatting or local UTF encoding. By default, these are only accessible to you as the user and to the system at runtime.

Environment variables can also be used to store credentials locally, especially for systems called programmatically. Overall, this is a reasonably safe approach, as they are stored by the system and can easily be invoked in scripts. However, there are a few things to be aware of:

TL;DR: env vars are okay within a limited scope (local machines and process-specific, and rotated from time to time), but are not okay in cloud environments.

If you have never run the command env in a terminal, the number of variables already there might seem a bit overwhelming. We invite you to pop open a terminal and run it now. If there is anything in there that looks like an API key or bearer token, it is a good idea to ask whether it is still needed and to clear out environment variables that are not in use.
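
A quick sketch of that audit; the grep pattern is only a heuristic, and OLD_API_TOKEN is a placeholder name:

```bash
# List everything exported in the current shell.
env | sort | less

# Rough scan for anything that looks like a credential.
env | grep -iE 'key|token|secret|passw'

# Clear a variable you no longer need (current shell only); also remove
# whatever line sets it in ~/.bashrc, ~/.zshrc, or ~/.profile.
unset OLD_API_TOKEN
```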

While we mentioned using pipes as a potential security threat surface if used incorrectly earlier in this article, we do actually love pipes in general!

Pipes in Bash take the output from one process or application and pass it into another process or tool. Since there are only two ends to a pipe and they exist in memory only for the duration of the communication, they are extremely secure in and of themselves.

The issue around using pipes comes from the temporary storage of items from standard input, stdin. This issue can be solved by moving the input to the pipe close to the source, aka the application that is feeding into stdin.

A good example that illustrates this point can be seen with cURL. When passing data to a request, you might first try to pass it in from standard input with -d "$(< /dev/stdin)". Instead, cURL lets you read the data directly from the pipe, without expanding it in the shell at all, with -d @-. Thanks to Carl Tashian for writing a very good summary of this pattern on his blog.
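
Side by side, assuming a placeholder endpoint and payload variable, the two approaches look like this; the second keeps the secret out of the expanded command line:

```bash
# Riskier: the command substitution expands the secret before cURL ever runs,
# so it ends up in cURL's argument list (visible via ps).
printf '%s' "$SECRET_PAYLOAD" | curl -s -X POST -d "$(< /dev/stdin)" https://api.example.com/v1/items

# Safer: -d @- tells cURL itself to read the request body from the pipe.
printf '%s' "$SECRET_PAYLOAD" | curl -s -X POST -d @- https://api.example.com/v1/items
```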

Working on the command line means working faster and more efficiently with a wider range of tools at your disposal. However, as we have spelled out here in this article, it also brings a certain overhead of vulnerability when it comes to credentials management.

The good news is you are not alone in this fight to keep your secrets a secret. There are plenty of tools out there that can help keep your credentials safe and secure. To sum up, our advice is to: take an inventory of the secrets on your machine, adopt a password manager or vault, encrypt sensitive data at rest, and keep credentials out of your shell history, logs, and command-line arguments.

We encourage you to have a conversation with your teams about security and see what tools they have already vetted for your organization. They will likely be able to help you identify ways you can work more securely every day while staying productive on the command line.

*** This is a Security Bloggers Network syndicated blog from GitGuardian Blog - Automated Secrets Detection authored by Dwayne McDaniel. Read the original post at: https://blog.gitguardian.com/secrets-at-the-command-line/

View original post here:
Secrets at the Command Line [cheat sheet included] - Security Boulevard

Read More..

CRITICALSTART Announces Enhanced Threat Detection and Response Capabilities to Support Microsoft Defender for Servers – PR Newswire

New service offering is the company's first threat detection and response solution to support the Microsoft Defender for Cloud product portfolio

PLANO, Texas, Oct. 5, 2022 /PRNewswire/ -- Today, Critical Start, a leading provider of Managed Detection and Response (MDR) cybersecurity solutions, announced the upcoming availability of its MDR service offering for Microsoft Defender for Servers, part of the Microsoft Defender for Cloud product portfolio. The new service will bring Microsoft customers unique capabilities to investigate and respond to attacks on workloads running in the cloud and help stop business disruption.

As business growth demands increase, enterprises are continuing to recognize the many advantages gained by adopting cloud computing services. The benefits include greater agility, lower infrastructure costs, faster deployment and superior availability. At the same time, cloud-based solutions have become an easy target for attacks because of their increased exposure to the Internet. In 2021, over 88% of organizations experienced cyberattacks on their cloud-native applications and infrastructure.1

Cloud Workload Protection (CWP) solutions, like Microsoft Defender for Cloud, bring security teams visibility and integrated threat protection across cloud workloads with automated security to detect and stop suspicious activity. These same security teams have the overarching challenge of being able to properly deploy, manage and optimize the solution as business needs change, in addition to being able to investigate and respond to evolving attacks before they disrupt business.

The Critical Start MDR service, working alongside Microsoft Defender for Servers, will empower security administrators by helping them monitor, investigate and respond to security alerts and incidents at cloud speed. By combining Critical Start's industry-leading Zero Trust Analytics Platform (ZTAP), which can auto-resolve false positives at scale, with its human-led monitoring, investigation and response, security teams can maximize performance to identify and contain a breach much more quickly. The Critical Start Security Operations Center can respond on behalf of Microsoft's customers to stop attacks on elastic and ephemeral cloud workloads.

"Utilizing cloud services can provide organizations with tremendous business value, but it is often coupled with a barrage of distinctive security challenges. Microsoft Security Solutions continue to lead the industry at addressing those challenges," said Randy Watkins, CTO at Critical Start. "As a Microsoft Security Design partner, we are excited to further extend our collaboration to address the unique and dynamic needs of our mutual customers and reduce the risk of security incidents in the cloud."

This new offering is part of a robust portfolio of services and solutions Critical Start offers for Microsoft Security. The company also has MDR offerings for Microsoft Sentinel, Microsoft Defender for Endpoint and Microsoft 365 Defender. Critical Start's MDR service for Microsoft Defender for Servers is anticipated to reach general availability in early 2023.

For more information on Critical Start and its solutions, please visit http://www.criticalstart.com/.

About Critical Start
Today's enterprise faces radical, ever-growing, and ever-sophisticated multi-vector cyber-attacks. Facing this situation is hard, but it doesn't have to be. Critical Start simplifies breach prevention by delivering the most effective managed detection and incident response services powered by the Zero Trust Analytics Platform (ZTAP) with the industry's only Trusted Behavior Registry (TBR) and MOBILESOC. With 24x7x365 expert security analysts and a Cyber Research Unit (CRU), we monitor, investigate, and remediate alerts swiftly and effectively, via contractual Service Level Agreements (SLAs) for Time to Detection (TTD) and Median Time to Resolution (MTTR), and 100% transparency into our service. For more information, visit criticalstart.com. Follow Critical Start on LinkedIn, @CRITICALSTART, or on Twitter, @CRITICALSTART.

1 - Enterprise Strategy Group - Unifying Security Controls to Manage Security Risk Across Cloud Environment: Helping Customers Efficiently Protect Their Critical Workloads in the Cloud, May 2021

SOURCE CRITICALSTART

See the original post:
CRITICALSTART Announces Enhanced Threat Detection and Response Capabilities to Support Microsoft Defender for Servers - PR Newswire

Read More..

Why Cloud Data Modernization Is Needed, and How to Make It Work – Acceleration Economy

When it comes to data, one fact has endured from the origin of mankind: it is inextricably linked to the decision-making process. The more data that we can include in our analysis, the more we can understand the past and navigate the future effectively.

Practices in the capture and storage of business data often from diverse global sources must evolve in response to the skyrocketing quantity of data that businesses produce and their need to act on it faster than ever. One research firm, Statista, forecasts that there will be 181 zettabytes of data by 2025, up from 97 zettabytes this year. A zettabyte is one billion terabytes. The chart below depicts this growth trajectory.

Companies can only store, manage, and act on data at the required speed by modernizing their data infrastructure. To do so, they need to move past the legacy construct of monolithic systems, which store a single type of data in siloed fashion with no movement of data between them. By modernizing such systems in the cloud, companies enable unification of data with robust new functionality and services that dont exist in legacy systems.

To understand the value of modernizing data in the cloud, it's helpful to start with this baseline, data-oriented definition of the cloud: a vast network of remote servers hooked together and meant to operate as a single ecosystem. These servers store and manage data, run applications, or deliver content or a service such as streaming video, web mail, office productivity software, or social media. Users can access files and data from any Internet-capable device, making information available anywhere, anytime.

Because of complexity, silos, and the need to have vast amounts and sources of data accessible at high velocity, the need to modernize data infrastructure takes on more urgency every day. Moving data to the cloud is the most compelling option because the cloud will deliver (at least) three critical benefits:

The cloud allows any organization to ingest, analyze and contextualize data at high speed. And we all know that fast decision-making and real-time actions are key to capitalizing on business opportunities in the Acceleration Economy.

In addition, the cloud requires low to no maintenance on the part of the customer, improving security and protection of data and systems, as well as data recovery in case of any threat or incident. This is especially important for highly regulated industries that require large volumes of historical data and regulated compliance by implementing business rules that apply to many systems and tools at once.

There is not a magic recipe for any organization to transition from traditional or monolithic data systems to a cloud data system. That entails moving from a physical infrastructure that has been designed as a reflection of a traditional, hierarchical organization towards something that is more flat, horizontal, and collaborative with fewer boundaries and barriers.

However, there are some cloud data modernization recommendations that should hold true in virtually all industries and use cases:

While the points above are ordered based on a logical sequence, the first point, relating to people, must be addressed at the outset. First, moving to the cloud challenges the status quo (data ownership, silos, org structure) of many organizations. With cloud technology, we are moving from a practice of "data to report to decision" to a more streamlined practice of "data to decision"; the implications of this new paradigm can be highly impactful.

Join us on October 27, 2022 for Acceleration Economy's Data Modernization Digital Battleground, a digital event in which four leading cloud vendors answer questions on key considerations for updating data strategies and technology. Register for free here.

So, when embarking upon modernization of the data stack, a company should start by educating (or re-educating) the entire workforce, starting from the top of the hierarchy, about being open and transparent, practicing collaboration among teams (which team generates and analyzes specific types of data), delegating more decisions to others, and learning about new technologies and tools. Once the cultural element has been addressed, let engineers and technical people handle the technical aspects of cloud data modernization.

Once migration and modernization have happened, the tech team must stay in close contact with the cloud infrastructure vendor(s) and have a clear understanding about the responsibilities of each party. It is very important to actively monitor cloud performance, storage, and applications usage as well as vendor billing practices known as FinOps. Close internal monitoring of billing, combined with good communication with the provider(s), facilitates solid operational results and keeps the cloud provider(s) fully engaged on your behalf.

There are numerous vendors offering cloud solutions, but again, each and every organization is unique, with a different vision, strategy, and goals. It is easy to understand, therefore, why each vendor is more suitable for certain use cases, industries, and businesses, so a deep understanding of each vendors product offering is critical before adopting one solution over others.

An evaluation of vendor strengths and alignment with your business goals and culture must include:

In the analysis above, I've focused on the why and how of data modernization in the cloud and shared important technical considerations.

There's one more critical technology factor to consider, and that's the vendor or partner you select to execute on your data modernization goals. In the table below, I'm presenting the companies, drawn from my direct, hands-on experience and ongoing engagement, that are the best candidates to help you, and some key strengths they offer. These companies, of course, are the subject of ongoing analysis at the Acceleration Economy site.

Looking for more insights into all things data? Subscribe to the Data Revolution channel:

Read this article:
Why Cloud Data Modernization Is Needed, and How to Make It Work - Acceleration Economy

Read More..

Is a 10-Year-Old Facebook Technology the Future of Cloud Security? – Security Boulevard

In the pantheon of semi-obscure open source tools, osquery is one that deserves a closer look from most security professionals. It's easy to see why this old Facebook tool, originally used to query operating system data, has flown under the radar. Initially, it was used to improve the usability of Facebook across different platforms; then a few individuals, mainly on the west coast of the U.S., saw a hidden superpower in osquery that could upend the way security is managed. Because osquery lets you query nearly all of an operating system's data like a database, with rich, standardized telemetry, it effectively creates an insanely powerful EDR tool, one that gives you broad visibility into exactly what is going on with an OS and lets you ask questions about your security posture. It essentially lets a team with the right know-how perform outsized threat hunting, detect and remediate faster, implement YARA rules, and more.
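
To make that concrete, here is a small, hedged sketch of what querying a system with osquery looks like from the shell; the specific predicates are illustrative, not a recommended hunting playbook:

```bash
# Interactive shell: explore the operating system as if it were a SQL database.
osqueryi

# One-off queries straight from the command line.
osqueryi --line "SELECT pid, name, path FROM processes WHERE name LIKE '%ssh%';"
osqueryi --line "SELECT * FROM listening_ports WHERE port = 22;"
osqueryi --line "SELECT username, shell FROM users WHERE uid >= 1000;"
```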

These superpowers created a small but very dedicated user base who were either active users or intrigued by what osquery could do.

But for all of osquery's might, there was a catch that prevented wider adoption. The open source version of osquery required knowledge of SQL and wasn't necessarily that easy to implement as part of a security stack. Also, in an increasingly cloud-native world, the open source version was at first limited to endpoints and was difficult to scale to cloud use. There's now a version from Uptycs that doesn't require knowledge of SQL, and it is a very powerful tool for securing laptops and other endpoints, Linux servers, and more. However, we now live in a cloud-first and cloud-native world. So is osquery still relevant?

Something that will become almost immediately apparent to any adept user of osquery is that it is almost infinitely scalable and flexible. That flexibility means that osquery is free to break out of its traditional domain of laptop endpoints, on-premises Linux servers and data centers and to secure the cloud. At the end of the day, osquery is just a way to query data points in an operating system. With some tinkering, it can be used in cloud environments like AWS, Azure or GCP, in container environments like Kubernetes or even, in theory, with identity providers or SaaS tools.

This flexibility effectively means that this open source tool can be used by organizations to monitor everything from developer laptops to the identity authenticator devs use to sign in to services. It can get structured telemetry from SaaS apps and container instances where code is built and tested and from cloud services where the code is ultimately deployed and run. This can all be done from a single platform using a single tool.

Take a moment to think about how radical of a departure this is for the security community. We're used to buying single-use tools for each environment, each operating in its own silo with its own data model and set of rules. We then try to assimilate them into a stack and use an aggregator like a SIEM to try to pull all of the information together into a single source of truth. If a vendor of one of those products branches into another space, say an EDR vendor that moves into cloud workload protection, it's usually done with a bolt-on acquisition of another company or technology that is often poorly integrated and implemented, and the data is often difficult to access or piece together into a unified picture. Not surprisingly, this way of doing things has led to gaps in visibility, alert fatigue, and frustration. This presents obvious challenges when today's high-growth companies are relying on a complex innovation supply chain to produce the code that powers their technology.

The transition to the cloud is only accelerating, but with industry attention focused on addressing the cloud threats that have been dominating the news, traditional endpoints are getting left behind. No matter how well streamlined your cloud security platform is, if it does not include endpoints like developer laptops or on-premises Linux servers, you are giving up crucial visibility into your innovation life cycle. With reduced visibility comes risk.

For many security leaders, osquery flies under the radar, or in some cases is not even on the map, as a solution to these problems. But it shouldn't be. The ability of osquery to ingest and structure data so that it is almost infinitely queryable is a superpower that can enable security teams to secure their entire ecosystem and future-proof their security stack. No matter what environments or operating systems your organization uses, osquery can help your security teams quickly and efficiently find the answers to almost any security, posture, or configuration question. If you're worried about the posture of endpoints, osquery can answer those questions. But it can also answer questions about lateral movement in container pods or misconfigurations in AWS too.

Osquery is an open source tool that has the power to transform how we secure the cloud and makes a strong case for itself as one of the most powerful and flexible security solutions ever created.


Continue reading here:
Is a 10-Year-Old Facebook Technology the Future of Cloud Security? - Security Boulevard

Read More..

The benefits of cloud-native software in financial services – The Paypers

Michael Backes, Co-Founder and CTO at receeve explores the benefits of cloud-native software especially when it comes to powering collections operations.

Security has always been a delicate topic when it comes to financial technology, especially within the banking industry and when partnering with third-party software providers. What are the real risks behind cloud-based software? How does cloud-native differ from on-premise alternatives? receeve experts share more.

A collaborative approach to service digitisation cuts both time and cost for companies by allowing them to focus their critical resources on their core business. They also gain the added benefit of bringing in domain-specific expertise on a specific part of the business, increasing reliability and positive outcomes.

Increasing service digitisation empowers businesses, as its more comprehensive data analysis and reporting improve access to customer insights. This facilitates detailed customer segmentation and offers opportunities to boost service levels through increased digital competence across multiple channels.

In the case of collections operations, businesses can avoid upfront hardware investments and costly ongoing maintenance - with the option to scale up or down to meet current demands. This serves institutions that are looking to focus their resources on their core business. Similarly, companies seeking to arm their teams with a tech stack that enables scalability and independence from IT departments can ensure resource assignment is optimised.

Ultimately, consumers benefit from an improved service offering, allowing them to streamline the products they use and sustain patronage with the companies meeting their needs.

The potential risks that arise from partnering with third-party companies can be many and varied: data exposures, failures in regulatory compliance, the adoption of inadequate security protocols, and more. If not taken into account, these issues can yield significant legal and reputational consequences.

In some instances, risks are increased when vendors outsource elements of their own service to third-party providers. This is because security protocols, levels of transparency, and data protection policies can vary from business to business.

As financial services providers increase their third-party dependence, it becomes essential to identify critical services and ensure effective oversight of both system tolerances and security risks. And since the financial sector is inherently interconnected, with multiple entities across the value chain, businesses must consider central risks before onboarding new vendors, including the threat of data breaches, unauthorised access to internal information, and the disclosure of intellectual property.

Since updates to legal and regulatory frameworks around data access and management are common, it can be a risk to simply assume your third-party vendor is safeguarding your operational and commercial compliance. This is evidenced by the fact that 64% of data breaches are linked to third-party operators - and the average data breach costs businesses over USD 7.5 million to remedy.

To mitigate these potential risks, many companies employ cybersecurity risk management controls that include vetting third-party security practices and establishing data breach and incident report protocols. Unfortunately, these measures are often resource-intensive and costly.

An added consequence of third-party software use is the potential for outages and system failures - often from oversights at the implementation stage - leading to interrupted service for customers. As with data breaches, these gaps in usability can often be reputationally damaging and costly to resolve. Many vendors, therefore, employ a continuous deployment approach, automating the building, testing, and rollout stages of the software delivery process with each iteration.

A large number of third-party software providers choose an alternative methodology, opting for longer production cycles that allow for increased testing prior to delivery, to reduce risk once the product is live.

Though many of these risks are associated with third-party vendors, the development of proprietary software also carries with it many of the same potential pitfalls, requiring ongoing maintenance and robust security systems. To achieve this, large financial outlays are necessary, to ensure ongoing development, support, and maintenance.

As outlined, many businesses conduct rigorous testing and vetting processes to ensure new vendors meet their commercial and operational needs, from a delivery, support, and legal compliance standpoint. Still, third-party companies themselves can shore up security levels by separating sensitive data from primary system infrastructures - ideally with the use of a single-tenant cloud-based environment.

On-premise applications are, as the name suggests, applications that are stored and run at a single premise - with data only being generated, stored, and accessed locally. A primary example would be an office with multiple computers running Microsoft Word. While the application may be installed or run across multiple computers, the files and documents created on a given machine will only be accessible by users logging into the same computer.

Cloud-native applications, on the other hand, maximise accessibility and eliminate reliance on a centralised storage source. They cut out the need for investment in expensive servers and allow for fast scaling, doing away with application developments, system management, and server-to-server integration.

Crucially, cloud-based applications have the added benefit of offering simple, pain-free integrations, since they use APIs to quickly facilitate communication between multiple systems and programs. This ensures your tech stack operates as a single, coherent application, letting you connect multiple tools at the click of a button.

Cost-effectiveness, efficiency, and interoperability are key factors for businesses adopting new technologies. Additionally, with no upfront hardware investments and maintenance costs, collections teams can scale their operations up or down at a moment's notice with speed and ease. Better still, cloud-native applications will automatically update and ensure ongoing support as new digital systems become available.

With data-driven, cloud-based applications, businesses can eliminate the stresses of maintaining legacy systems and implementing non-cloud-native software, letting their collections teams focus on essential tasks. This frees up opportunities for staff to refine customer segmentation approaches and develop more successful collection strategies in the long term.

Michael Backes has 20 years in the tech industry as an entrepreneur helping organisations transform their legacy frameworks into digital-first models. In 2019 Michael brought his experience building next-generation financial services to the debt management industry and co-founded and launched receeve GmbH, a cloud-native solution for the collections & recovery industry. receeve is venture capital funded and growing the team aggressively in the EU & LatAm markets. receeve transforms debt management with a comprehensive data layer and ML/AI helping internal teams recover more by optimising processes, strategies, engagement, and asset management.

Read more:
The benefits of cloud-native software in financial services - The Paypers

Read More..

Why Are QuickBooks in the Cloud the Future of Accounting? – Business Review

Are you still using an outdated accounting system costing you time and money? If so, you need to switch to QuickBooks in the Cloud.

QuickBooks in the Cloud is the future of accounting. Cloud-based software is becoming increasingly popular because it is accessible from anywhere, making it a convenient choice for business owners. Cloud-based software is also more secure than traditional software, which can be prone to data breaches.

QuickBooks in the Cloud is an excellent choice for businesses that want to save time and money while keeping their data safe and secure.

QuickBooks in the Cloud is a cloud-based accounting solution that allows businesses to manage their finances online. QuickBooks in the Cloud is an excellent choice for businesses of all sizes because it is affordable, easy to use, and allows users to access their data from anywhere.

There are many reasons why QuickBooks in the Cloud are the future of accounting.

QuickBooks in the Cloud is a convenient choice for businesses that want to save time and money. QuickBooks hosted in the cloud can be accessed from any internet-connected device, making it a suitable choice for business owners who travel frequently or have employees who work remotely.

One of the most significant advantages of QuickBooks in the Cloud is that it allows users to work from anywhere at any time. You only need an internet connection and can access your QuickBooks files from your laptop, tablet, or smartphone. This increased flexibility and mobility will allow you and your team to be more productive both in and out of the office.

QuickBooks in the Cloud also allows for better collaboration between team members. For example, let's say you're working on a client's file and need to send it to your manager for review.

With QuickBooks in the Cloud, you can easily share files with anyone, regardless of location. This collaborative environment will help streamline your workflow and improve communication within your team.

Businesses that use QuickBooks in the Cloud can save time and money in several ways.

Another great benefit of QuickBooks in the Cloud is that it is accessible from anywhere. QuickBooks in the Cloud can be accessed from any internet-connected device, making it a convenient choice for business owners who travel frequently or have employees who work remotely.

Regarding sensitive client data, security is always a top concern for accountants. With QuickBooks in the Cloud, your data is stored on secure servers that are backed up regularly. In addition, you can control who has access to your files and set permission levels accordingly. This enhanced security will give you peace of mind, knowing that your clients' data is safe and secure.

Many other features make QuickBooks in the Cloud a good choice for businesses.

QuickBooks in the Cloud is the future of accounting because it offers many benefits that traditional accounting software does not. QuickBooks in the Cloud is more affordable, convenient, and secure than traditional software.

In addition, QuickBooks in the Cloud can be customized to meet the specific needs of businesses. QuickBooks in the Cloud is a good choice for businesses of all sizes.

The cloud has revolutionized businesses by providing them with several advantages previously unavailable with traditional on-premise software. QuickBooks in the Cloud, in particular, is an excellent choice for businesses because it is accessible from anywhere, more secure than traditional software, and can be customized to meet the specific needs of each business.

In addition, QuickBooks in the Cloud offers flexible pricing plans and automatic backups. As a result, more and more businesses are choosing to use QuickBooks in the Cloud as their accounting system. QuickBooks in the Cloud is the future of accounting.

Read more here:
Why Are QuickBooks in the Cloud the Future of Accounting? - Business Review

Read More..

Dell has Liqid route to CXL memory pooling Blocks and Files – Blocks and Files

Dell has shown how its MX7000 composable server chassis can be used with Liqid technology to add PCIe gen 4-connected GPUs and other accelerators to the composable systems mix, with an open road to faster still PCIe gen 5, CXL, and external pooled memory.

The four-year-old MX7000 is an 8-bay, 7RU chassis holding PowerEdge MX server sleds (aka blades) that can be composed into systems with Fibre Channel or Ethernet-connected storage. The servers connect directly to IO modules instead of via a mid-plane, and these IO modules can be updated independently of the servers. Cue Liqid upgrading its IO modules to PCIe gen 4.

Liqid has supported the MX7000 since August 2020, with PCIe gen 3 connectivity to GPUs and other devices via a PCIe switch. Kevin Houston, a Dell principal engineer and Field CTO, writes: "The original iteration of this design incorporated a large 7U expansion chassis built upon PCIe Gen 3.0. This design was innovative, but with the introduction of PCIe Gen 4.0 by Intel, it needed an update. We now have one."

He showed a schematic of such a system:

The MX7000 chassis is at the top, with eight upright server sleds inside it. A Liqid IO module is highlighted: a PCIe HBA (LQD1416) wired to a Liqid 48-port PCIe gen 4 fabric switch. This connects to a Liqid PCIe gen 4 EX-4400 expansion chassis, which can hold either 10 Gen 4 x16 full-height, double-wide (EX-4410) or 20 Gen 4 x8 full-height, single-wide (EX-4420) accelerators.

The accelerator devices can be GPUs (Nvidia V100, A100, RTX, and T4), FPGAs, SSD add-in cards or NICs.

Houston writes: "Essentially, any blade server can have access to any [accelerator] device. The magic, though, is in the Liqid Command Center software, which orchestrates how the devices are divided up over [PCIe]."

Liqid's Matrix software allocates accelerators to servers, with up to 20 GPUs allocated across the eight servers in any combination, even down to all 20 GPUs to a single server.

It seems to us at Blocks & Files that this MX7000 architecture and Liqid partnership means that PCIe gen 5, twice as fast as PCIe gen 4, could be adopted, opening the way to CXL 2.0 and memory pooling.

This would require Dell to equip the MX7000 with PowerEdge servers using Sapphire Rapids (4th Gen Xeon Scalable) processors or PCIe gen 5-supporting AMD CPUs. Then Liqid will need a PCIe gen 5 HBA and switch. Once at this stage, it could provide CXL support and memory pooling with CXL 2.0.

When memory pools exist on CXL fabrics, composability software will be needed to dynamically allocate memory to servers. Suppliers like Dell, HPE, Lenovo, Supermicro, etc. could outsource that to third parties such as Liqid, or decide that the technology is core to their products and build it, acquire it, or OEM it.

CXL memory pooling looks likely to be the boost that composability needs to enter mainstream enterprise computing and support use cases such as extremely large machine learning models. How the public cloud suppliers will use memory pooling, both internally and externally, as memory-pooled compute instances, is an interesting topic to consider.

Continued here:
Dell has Liqid route to CXL memory pooling Blocks and Files - Blocks and Files

Read More..

The 13 Most Promising Cybersecurity Startups of 2022 – Business Insider

The market to catch cyberattackers is hot. And it will only continue to heat up, Allie Mellen, a cybersecurity analyst at Forrester, told Insider.

A report from the data firm Research and Markets suggested the global cybersecurity market was estimated to be worth $173.5 billion in 2022 and could grow to $266.2 billion by 2027.

Businesses are looking for new ways to protect their data. Mellen said companies have become more prone to attacks that can cost them tens of millions of dollars to fix as they move to the cloud. As a result, hackers are developing more sophisticated ways to steal data; even legacy tech companies with established internal security measures, such as Cisco, Nvidia, and Twilio, were victims of attacks this year, Mellen said.

In turn, venture-capital firms are pouring billions of dollars into cybersecurity startups to help keep businesses secure. Rama Sekhar, a partner at Norwest Venture Partners, said that as companies invest in more cloud tools, they'll also buy more tools to keep their security up to date.

"A lot of the security companies are now focused on cloud," he told Insider.

Alex Kayyal, a Salesforce Ventures managing partner, said the move to hybrid and remote work made that demand even greater.

"The office has become an infinite canvas of a location, and so security becomes that much more important," he said.

Additionally, VCs told Insider that cybersecurity isn't an area where companies are likely to cut spending in a downturn.

But cybersecurity is a mature industry, dominated by giants like Microsoft, IBM, and Oracle. Mellen said there's room for innovation in helping businesses protect cloud servers, pinpoint vulnerabilities in their data systems, secure internet browsers, and enable employees with no coding expertise to build out their firm's cybersecurity strategy.

Mellen added that businesses aren't adopting the new tools quickly.

Insider asked several VCs to pick the most promising cybersecurity startups both in and out of their portfolios. All company valuations and funding information are according to PitchBook unless otherwise noted.

See the original post:
The 13 Most Promising Cybersecurity Startups of 2022 - Business Insider

Read More..

There’s one important thing about the Pixel 7 we still don’t know – TechRadar

The way a phone feels in your hand is one of the most important factors in buying a phone, and it's one of the last things we don't know about the Google Pixel 7. We've read through leaked specs, watched leaked promotional videos, and pondered the implications of new features. We just haven't gotten our hands on one.

If you pick up a Galaxy S22 Ultra and flip it over and over again in your hand, it feels smooth all the way around. It won't catch on your fingers or scratch your skin. The rounded edges are easy to hold, and the phone feels stiff and solid in a way that conveys strength. The phone's weight and density make it feel premium.

If you pick up a Motorola Edge (2022), it feels remarkably light. That's appealing at first, but then you notice the plastic feel of the case. The finish is very attractive and catches the light in an interesting way, as much as a dark grey can be appealing. The feel of this phone inexorably conveys a sense that it belongs in the mid-range. It isn't impressive to hold, but it does have some appeal.

We can quote spec numbers and speculate on software forever, but until we hold the Google Pixel 7 and Pixel 7 Pro in hand, we won't know how exciting these phones will be for potential buyers.

Right now the phones don't seem very exciting, mostly because they don't offer a significant upgrade over the Google Pixel 6 and Pixel 6 Pro. As technology enthusiasts, we're always rooting for companies to push the envelope. That said, if you're considering a Pixel 7, you probably have a phone that's older than a Pixel 6, and the Pixel 6 was already a big step forward for Google, especially in terms of design.

The Google Pixel 5 looked like every other boring smartphone on the market: it was a flat slab with some camera lenses tucked into a corner. The Pixel 6, by comparison, is a standout device. It's not just colorful, it's polychromatic, and the colors are unique and refreshing in an industry of silver, blue, and depressing purples.

The Pixel 6 houses the cameras in a distinct black bar across the top of the phone's back. If you see a Pixel 6 in someone's hand, you know what phone they're using. There's nothing wrong with showing off the device you carry with you everywhere and use all the time. That's the point.

The other big question is what the new Google Tensor 2 chipset will bring to the devices. We've heard about a few new features from the leaks, but the Tensor 2 could play a major part in how well these features perform, and how much they impress us.

For instance, we've all used speech recognition when talking to Google Assistant or Apple's Siri. When you talk to Google, it uploads your speech to Google's cloud, and the cloud servers process your speech to understand what you want. With the Tensor platform in the Google Pixel 6, Google moved much of this processing to the phone. This made speech recognition much faster and more efficient (and maybe more private).


We've seen mention in leaks of features like a macro focus mode in the camera, live language translation features, and even improvements to call quality. All of these presumably use Google's AI magic. We're curious to see if Google brings these features to life through the power of the Tensor 2, much as it moved speech recognition to the chip.

Until we have a chance to get hands-on with the phone, we just won't know how these features perform. If the live translation feature is slow or relies heavily on a network connection, it won't be as useful. If somehow Google has invented a portable universal translator device that works like the magical "Computer" on Star Trek, we'll be blown away.

In fact, the entire phone experience is still a mystery, because we don't know how well the new platform will drive Google's Android 13 OS. We know about the refresh rate of the screen, but can the Tensor 2 really push the user interface to 120fps and max out the display's potential? Will the phone stutter when we load up three different mapping apps while scrolling through our TikTok feed? Those are questions we can only answer with the phone in hand.

Maybe the Pixel 7 Pro won't be the fastest phone on the market. Perhaps the new photo features won't blow us away. The leaked photos of the Pixel 7 look an awful lot like a Pixel 6, just maybe a little grown-up.

We're still reserving judgment on the Google Pixel 7 and Pixel 7 Pro until we get our hands on them. The way they feel and the way they perform, these aspects are too important to judge in advance.

We'll be live at Google's event for a complete look at the new phones, as well as the new Pixel Watch, and then we'll know if the Pixel 7 is the best Pixel phone Google has ever made.

See original here:
There's one important thing about the Pixel 7 we still don't know - TechRadar

Read More..