
Five-Story Mural in Budapest to Pay Tribute to mRNA Pioneer Katalin Karikó – Hungary Today

A huge, five-story mural depicting Hungarian-American biochemist Katalin Karikó will be created on a firewall in Budapest to pay tribute to the scientist who played a key role in the development of the mRNA Covid vaccines.

The original article was written by our sister site, Ungarn Heute.

Next to Karikó's portrait will be placed the motto "The future is written by Hungarians," announced the organizers of the future festival Brain Bar.

The artwork was designed and is being created by the talented and experienced team of Colorful City (Színes Város), and is also supported by the Hungarian government.

The giant mural on Krisztina Boulevard will be completed by the launch of this year's Brain Bar, held September 9-10.

According to organizers, Brain Bar aims to strengthen the ambitions of Hungarian youth and encourage them to actively shape the future. "Year after year we bring role models to the festival who can inspire them to do this, but sometimes the role models are walking among us."

"In addition to her academic achievements, Katalin Karikó is doing Hungary a great service by proving to the next generation that we have a place among those shaping the 21st century," said Gergely Böszörményi-Nagy, the founder of Brain Bar.

The Hungarian biochemist, who emigrated to the United States in 1985, contributed significantly to the fight against the coronavirus as vice president of the Mainz-based company BioNTech by co-developing mRNA technology. She recently received the Széchenyi Award for her outstanding work.

The future festival Brain Bar was founded with the aim of discussing the future of humanity, on both an individual and a community level. In previous years, guests have included star psychologist Jordan Peterson, paleontologist Jack Horner, and PayPal founder Peter Thiel.

Featured photo via Facebook

Top Cyber Security Threats to Organizations – CIO Insight

Cyber security threats are a constant for organizations, whether they do business with the public or with other organizations. Cyber threats are malicious attempts to gain unauthorized access to an organization's network and the resources on it.

Cybercriminals around the world are constantly attempting to infiltrate organizations' networks, and these threats can easily become cybercrimes if organizational leadership does not champion a cyber security program.

It's imperative that organizational leadership and senior management provide the required manpower, training, and tools to mitigate cyber threats. Without support and buy-in from upper and middle management, an organization may expose itself to any number of cyber threats.

In 2020, cyber threats turned into mass data breaches that compromised user accounts, email addresses and credit card information. Some of this information was sold on the dark web.

Organizations must be vigilant in keeping cyber threats from becoming cybercrimes. Cyber threats are only prevalent today because they keep making money for cybercriminals. Cybercriminals value information that can generate immediate revenue, either directly or when sold on the dark web. They especially value the following types of business information:

Cybercriminals are motivated by the potential for stealing financial and intellectual property information; organizations must be equally motivated to eliminate or mitigate any cyber threats.

Cybercrimes are estimated to reach $10.5 trillion in damages annually by 2025, according to Cybersecurity Ventures. Further, Coalition found that ransomware was responsible for 41% of the cyber insurance claims payouts in the first half of 2020.

Any organization or person can be the target of a cybercriminal, but these criminals tend to favor soft targets with a higher potential payout. The most vulnerable organizations need to ensure management is fully invested in a sound cyber security program. According to CDNetworks, these are the most vulnerable industries:

Whether leadership is managing a financial institution or a small business, management staff must have a working understanding of cyber security risks in order to mitigate cyber threats.

Management personnel can ensure cyber security best practices are implemented by consulting resources such as the Center for Internet Security (CIS) and the National Institute of Standards and Technology (NIST) and benchmarking their current cyber security practices against them.

Being keenly aware of the most popular cyberattacks should be part of the required annual security training for any organization. Cyber threats can occur internally or externally.

These are the top internal cyber threats, according to Endpoint Protector.

These are the top five external cyber threats.

The best way to mitigate an internal or external cyber threat is to establish a clearly defined cyber security program that is disseminated to every employee within an organization. What's more, no cyber security program can be successful if it is not championed by leadership.

Read more: What Is Enterprise Security Management?

An annual or semiannual cyber security training program must be firmly established in the organization. Further, a refresher training session may be required if a new cyber threat is presented, or if repeated risky employee behavior is observed. A robust cyber security program also covers disciplinary actions for infractions committed by an employee.

Cyber security is the responsibility of every member in the organization, especially management. Cyber security assets (e.g., hardware and software) and training for the employees and DevOps staff are all essential to a successful cyber security program.

Organizational leadership and senior management are also essential to the success of a good cyber security program. Threat-conscious behavior must be exhibited daily by leadership.

Read next: Are Your Containers Secure?

The Right Way to Structure Cyber Diplomacy – War on the Rocks

The modern State Department was forged in an era of global transformation. In the 1930s, the department had fewer than 2,000 personnel and, as one historian emphasized, it was a placid place that was comfortable with lethargic diplomacy. World War II revolutionized the department, which readily transformed itself to handle the demands of planning a new international order. Between 1940 and 1945, the department's domestic staff levels tripled and its budget doubled.

Today, the State Department is once again confronting the challenge of how to organize itself to cope with new international challenges, not those of wartime, but ones created by rapid technological change. There are ongoing conversations about how the department should handle cyberspace policy, as well as concerns about emerging technologies like artificial intelligence, quantum computing, next-generation telecommunications, hypersonics, biotechnology, space capabilities, autonomous vehicles, and many others.

As Ferial Ara Saeed recently emphasized, the department is not structured in a way that makes sense for addressing these matters. She is not alone in having this view, and others have also offered ideas for reform. Former Secretary of State Mike Pompeo's proposal for a Bureau of Cyberspace Security and Emerging Technologies focused too narrowly on security, as Saeed correctly diagnoses. As an alternative, she proposes consolidating all technology policy issues under a new under secretary, who would report to the deputy secretary of state for management and resources.

The State Department should be restructured so that it can conduct effective cyber diplomacy, but establishing one bureau for all things technology-related is not the way to proceed. Conceptually, the core challenges for cyberspace policy are different from those related to emerging technology issues, and creating one all-encompassing bureau would generate multiple practical problems. Instead, the department should establish a Bureau of International Cyberspace Policy, as proposed in the Cyber Diplomacy Act. Consolidating cyberspace policy issues in a single bureau would provide greater coherence to overarching priorities and day-to-day diplomatic activities. Emerging technology issues should remain the responsibility of the appropriate existing bureaus. If they are provided with greater resourcing and if appropriate connective tissue is created, those bureaus will have greater flexibility in crafting individualized strategies for a very diverse array of technologies. At the same time, the department would be able to prioritize and adopt a strategic approach to technology diplomacy.

Cyberspace Matters Are Different from Other Technology Issues

Through our work as staff of the U.S. Cyberspace Solarium Commission, we have observed how cyberspace policy will have impacts on U.S. foreign policy and international relations that differ fundamentally from those produced by other technology issues. That is why cyberspace policy warrants a distinct foreign policy approach.

Unlike other technologies, cyberspace has created a new environment for international interaction. As Chris Demchak describes, cyberspace is "a substrate that intrudes into, connects at long range, and induces behaviors that transcend boundaries of land, sea, air, institution, nation, and medium." Since the early 2000s, as one brief has put it, "states have recognized cyberspace and its undergirding infrastructure as not only strategic assets, but also a domain of potential influence and conflict." At the same time, a lack of international agreement or clarity on key definitions compounds the difficulties of dealing with cyberspace as a new arena of state-to-state interaction.

A U.N. Group of Governmental Experts produced a consensus report outlining norms of responsible state behavior in cyberspace that was welcomed by the U.N. General Assembly in 2015. However, U.N. members were by no means agreed on how international law applies to cyberspace. Although that issue was addressed more successfully in 2021, diplomats are still negotiating critical questions like what counts as cybercrime, critical infrastructure, espionage, or many of the other foundational concepts in this area. All of these questions, and many others beyond the negotiations of the United Nations, have long-term implications for the future of the internet, as cyberspace policy experts navigate a path between security and surveillance, and between openness and authoritarianism. To be successful in this diplomacy, the State Department should prioritize these issues and provide its diplomats with organizational structures that will support America's proactive leadership. In short, the State Department should have a dedicated cyberspace policy bureau.

The focus and activities of such a bureau would be functionally very different from what will be involved in addressing other technology issues. A Bureau of International Cyberspace Policy would be responsible for implementing a relatively established policy for cyber diplomacy. The head of the bureau would be working to ensure an open, interoperable, reliable, and secure internet, pushing back on authoritarian leanings in internet governance, and advocating for a multi-stakeholder model for the future of cyberspace. Certain details may change, but the core elements of this policy have been consistent across administrations and Congresses. Accordingly, the real added value of a cyberspace policy bureau is not in defining policy, but rather implementing that policy, which will require extensive engagement with non-aligned countries to help sway the balance of opinion toward an open internet, and international capacity-building efforts to help drive progress toward greater global cyber security.

By contrast, the challenge U.S. policymakers confront on emerging technologies is a question of establishing what America's international policies and diplomatic strategies should be. As the National Security Commission on Artificial Intelligence observed in relation to the State Department, "a lack of clear leadership on emerging technology hinders the Department's ability to make strategic technology policy decisions as part of a larger reorientation toward strategic competition."

Policymakers and officials working on emerging technologies will also face the challenge of adapting overarching policies as technologies emerge, develop, and ideally stabilize over time. Emerging technologies do not remain emerging indefinitely, and so an organizational structure that allows the development of cohesive strategies around these technologies should have the flexibility to shift between topics. Of course, cyberspace policy and the strategic considerations that guide it will also certainly need to adapt to changes, but its basic focus is likely to remain more stable. Much of America's work in outlining cyberspace policy has already been done, and thus the missions that remain (for example, working with partners and allies on joint attribution of cyber attacks, rallying votes in the United Nations, and managing capacity-building projects) are unlikely to change dramatically any time soon.

Undoubtedly, there will be many areas of overlap between the work of those handling emerging technology issues and the responsibilities of a cyberspace policy office. But there will also be overlap between efforts on emerging technologies and matters handled by the Bureau of Economics and Business Affairs, the Bureau of East Asian and Pacific Affairs, the Bureau of International Security and Nonproliferation, and many others. The fact that there is overlap between two organizational constructs should not be taken as a justification to merge them, and while technology obviously plays a central role in both cyberspace policy and emerging technologies policy, the actual work required to address them is very different.

It also makes sense to keep some technology issues in their current bureaucratic homes because of their historical legacy and the subsequent development of specialized expertise within those homes. No one would suggest, for example, that emerging issues in nuclear technology should be pulled out of the Bureau of International Security and Nonproliferation and made the responsibility of a new emerging technology bureau. And some technologies might only have globally significant implications for a relatively short period of time. Advanced robotics, for example, might have a major impact on manufacturing and broader economic areas, which could require the sustained attention of policymakers as they grapple with the initial implications of such technology. But once advanced robotics become a routine part of industrial operations, it would make less sense to have brought the issue under a new bureau when the pre-existing functional and regional bureaus might be best poised to address the relevant challenges.

Making every technology policy the responsibility of one under secretary would not solve the State Department's current problems. Instead, it would result in unclear prioritization and strained resources, and would leave one leader handling two very different mission sets.

The Importance of Avoiding a Security-Focused Approach to Cyberspace

In creating a Bureau of International Cyberspace Policy, the State Department should also avoid limiting that bureau's focus solely to security-related matters. That was one of the flaws with the previous administration's efforts to create the Bureau of Cyberspace Security and Emerging Technologies. While that bureau never materialized, the Government Accountability Office roundly criticized the State Department for failing to provide data or evidence to support its plans and for its lack of consultation with other federal agencies. Rep. Gregory Meeks, the chairman of the House Foreign Affairs Committee, emphasized that the proposed office would not have been in a position to coordinate responsibility for the security, economic, and human rights aspects of cyber policy.

Any reorganization of the State Department should ensure that diplomats can take into account all dimensions of cyberspace policy (political, economic, humanitarian, and security) and elevate them within the department. That would allow a new bureau to lead the way in promoting a free and secure internet. Some of the reform proposals that have been put forward reflect this approach. For example, the Cyber Diplomacy Act, which has already passed in the House, would create an ambassador-at-large position, with rank equal to that of an assistant secretary, to lead a new cyber bureau. That person would report to the under secretary for political affairs or an official of higher rank, which leaves open the possibility that the position would report directly to the secretary of state or one of the department's two deputy secretaries. While some have proposed the deputy secretary for management and resources for this reporting chain, that position has a history of going unfilled, and having a new cyberspace bureau report to it is a recipe for undercutting the fledgling bureau before it can even get off the ground. A better alternative would be to allow the State Department some flexibility in determining a new bureau's reporting structure, which might include the more natural choice of reporting to the other deputy secretary.

An overly narrow focus on security is not the only trap to avoid in creating a new cyber bureau. Orienting it around the idea of strategic competition with China would also be a problem. No doubt China will remain a key driver of U.S. policy for years to come, but global threats and opportunities may look very different in future decades than they do now. Cyber diplomacy should not be oriented around one adversary specifically, and the structure and functioning of a new cyberspace policy bureau should stand the test of time.

The Devil Is in the Details, But a Cyberspace Policy Bureau Is the Best Approach

The unfortunate political reality is that reorganizing the State Department is hard. That alone is not a reason to forgo reform, but it does introduce constraints on what may be feasible. Any new office or bureau will need leaders, but current law strictly limits the rank that they can hold. Creating a new under secretary, or even a new assistant secretary, would require significant changes to the State Department Basic Authorities Act, and there is limited political momentum for that particular undertaking. The law currently authorizes the appointment of 24 assistant secretaries and six under secretaries. Although the Cyberspace Solarium Commission initially recommended creating an assistant secretary position to lead a new cyber bureau, and although it has been clear for two decades that the State Department's structure should be overhauled, making such drastic changes to the necessary legislation may be a nonstarter on Capitol Hill for the foreseeable future. The Cyber Diplomacy Act provides the best available work-around by placing an ambassador-at-large at the head of the new bureau, ensuring that the position has the stature necessary for effective leadership.

The new bureau would also have to contend with the challenges of prioritization. The Cyber Diplomacy Act lists a wide variety of issues (including internet access, internet freedom, digital economy, cybercrime, deterrence, and international responses to cyber threats) that would become a cyberspace bureau's responsibilities. Even without giving it emerging technology topics to handle, consolidating just cyberspace policy issues will require careful planning to determine which pieces get pulled from existing bureaus. To allow a new bureau to adequately deal with digital economy matters, for example, policymakers would need to decide which aspects of that issue get moved from the purview of the Bureau of Economic and Business Affairs. The new bureau would have a good case for inheriting responsibility for portfolios like investment in information communications technology infrastructure abroad, particularly as it relates to cyber security capacity building, but there is a strong argument for other pieces like e-commerce to remain in their existing homes. The more bearing a particular team's work has on preserving an open, interoperable, reliable, and secure internet, the more it should be considered a strong candidate for incorporation into a new bureau.

Moving the responsibility for particular policy matters is not the only tool available, however. The Cyber Diplomacy Act creates an avenue for the new bureau's personnel to engage other State Department experts to ensure that concerns like human rights, economic competitiveness, and security have an influence on the development of U.S. cyber policy. The proposed Cyberspace Policy Coordinating Committee would ensure that officials at the assistant secretary level or higher from across the department can weigh in on matters of concern for their respective portfolios.

With a new cyberspace policy bureau, a coordinating committee, and enhancements to emerging technology capacity in its existing regional and functional bureaus, the State Department would be structured to handle the digital age effectively.

Natalie Thompson is a Ph.D. student in political science at Yale University. Previously, she was a research analyst for the U.S. Cyberspace Solarium Commission and a research assistant and James C. Gaither junior fellow at the Carnegie Endowment for International Peace, working with the Technology and International Affairs Program on projects related to disinformation and cyber security. She tweets at @natalierthom.

Laura Bate is a senior director with the U.S. Cyberspace Solarium Commission and a 2021 Next Generation National Security Fellow with the Center for a New American Security. Previously, she was a policy analyst with New America's Cybersecurity Initiative and remains an International Security Program Fellow. She tweets at @Laura_K_Bate.

Image: State Department (Photo by Freddie Everett)

FBI report looks at the tactics of a ransomware affiliate – Channel Daily News

Infosec pros can now study the tactics of a ransomware affiliate gang that has been attacking U.S. organizations since late last year, information which can help them defend against some attacks.

The intel comes from the FBI, which this week issued a report on a gang calling itself the OnePercent Group.

The name apparently comes from its threat to release one per cent of a victim organization's stolen data if a ransom isn't paid.

Affiliate groups are gangs that take advantage of the ransomware-as-a-service offerings of big ransomware developers like REvil/Sodinokibi, Darkside, Dharma, LockBit and others. For a monthly fee, affiliates get the bulk of a ransom, with the developer getting about 20 to 30 per cent of the payment.

According to the report, like most ransomware attackers the OnePercent Group sends out phishing email with an infected Microsoft Word or Excel attachment, with the payload executing through a macro. This leads to the download of the IcedID banking trojan. According to the Center for Internet Security, IcedID (also known as BokBot) is "a modular banking trojan that targets user financial information and is capable of acting as a dropper for other malware."
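As an illustration of how defenders might screen for this initial infection vector (this sketch is ours, not part of the FBI report), the open-source oletools Python package can flag Office attachments that contain VBA macros before they ever reach users:

```python
# Illustrative sketch (not from the FBI report): flag Office attachments that
# contain VBA macros, using the open-source oletools package
# (pip install oletools). File paths are supplied on the command line.
import sys
from pathlib import Path

from oletools.olevba import VBA_Parser

def has_macros(path: Path) -> bool:
    """Return True if the Office document at `path` contains VBA macros."""
    parser = VBA_Parser(str(path))
    try:
        return parser.detect_vba_macros()
    finally:
        parser.close()

if __name__ == "__main__":
    for name in sys.argv[1:]:
        doc = Path(name)
        verdict = "MACROS FOUND - quarantine for review" if has_macros(doc) else "clean"
        print(f"{doc}: {verdict}")
```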

This gang uses it to download the Cobalt Strike threat emulation software. A legitimate testing tool, it has become a favourite aid for threat actors. According to Malpedia, Cobalt Strike deploys an in-memory agent named Beacon on the victim machine, which can be used for command execution, keylogging, file transfer, SOCKS proxying, privilege escalation, mimikatz (for harvesting authentication credentials), port scanning, and lateral movement through and across networks. The FBI report notes this group uses Cobalt Strike in part to move laterally through PowerShell remoting.

To copy and exfiltrate data prior to deploying ransomware, this gang uses rclone, a command-line program for managing files on cloud storage.

Once the ransomware is successfully deployed, the report says, the victim will receive phone calls with ransom demands through spoofed phone numbers. Victims are also provided a ProtonMail email address for further communication. The actors will persistently demand to speak with a victim company's designated negotiator or otherwise threaten to publish the stolen data. When a victim company does not respond, the report says, the actors send subsequent threats to publish the victim company's stolen data.

The report also includes indicators of compromise that security teams can watch for, including hashes associated with rclone.
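For teams that want to act on those indicators, hash-based scanning is straightforward to sketch. The snippet below is a minimal illustration; the hash value is a placeholder, not a real indicator, and should be replaced with the hashes published in the FBI advisory:

```python
# Minimal sketch of hash-based IOC matching. The SHA-256 value below is a
# placeholder, not a real indicator; substitute the hashes published in the
# FBI advisory before use.
import hashlib
from pathlib import Path

KNOWN_BAD_SHA256 = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MB chunks so large binaries don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root: Path) -> None:
    for path in root.rglob("*"):
        try:
            if path.is_file() and sha256_of(path) in KNOWN_BAD_SHA256:
                print(f"IOC match: {path}")
        except OSError:
            pass  # unreadable file; skip

if __name__ == "__main__":
    scan(Path.home())  # example starting point; widen for a real sweep
```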

The FBI urges organizations to do the following to reduce the odds of being victimized by ransomware. It's also good advice for fending off any cyber attack:

- Back up critical data offline.
- Ensure administrators are not using Admin Approval mode.
- Implement Microsoft LAPS (Local Administrator Password Solution), if possible.
- Ensure copies of critical data are in the cloud or on an external hard drive or storage device, and that this information is not accessible from the compromised network.
- Secure your back-ups and ensure data is not accessible for modification or deletion from the system where the original data resides.
- Keep computers, devices, and applications patched and up-to-date.
- Consider adding a coloured email banner that clearly identifies emails received from outside your organization. This helps alert users to malicious email that purports to be from fellow employees.
- Disable unused remote access/Remote Desktop Protocol (RDP) ports and monitor remote access/RDP logs (a quick reachability check is sketched after this list).
- Audit user accounts with administrative privileges and configure access controls to give users the least privilege needed for their work.
- Use network segmentation to separate critical data.
- Make users adopt multi-factor authentication with strong passphrases.
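As referenced in the RDP item above, here is a minimal sketch of checking whether RDP is reachable on a set of hosts. The host names are hypothetical, and such checks should only be run against systems you are authorized to test:

```python
# Hypothetical helper for the RDP recommendation above: report which hosts
# accept connections on TCP 3389 so unused RDP can be disabled. Host names
# are invented; only run this against systems you are authorized to test.
import socket

HOSTS = ["server01.example.local", "server02.example.local"]  # hypothetical

def rdp_reachable(host: str, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, 3389), timeout=timeout):
            return True
    except OSError:
        return False

for host in HOSTS:
    state = "OPEN - confirm RDP is actually needed" if rdp_reachable(host) else "closed/filtered"
    print(f"{host}: {state}")
```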

Currently a freelance writer, I'm the former editor of ITWorldCanada.com and Computing Canada. An IT journalist since 1997, I've written for several of ITWC's sister publications including ITBusiness.ca and Computer Dealer News. Before that I was a staff reporter at the Calgary Herald and the Brampton (Ont.) Daily Times. I can be reached at hsolomon [@] soloreporter.com

Internet Security Market to Witness Astonishing Growth by 2027 | HPE, IBM, Intel and more – Research Interviewer

The report offers a complete research study of the global Internet Security Market, including accurate forecasts and analysis at the global, regional, and country levels. It provides a comprehensive view of the market and a detailed value chain analysis to help players closely understand important changes in business activities observed across the industry. It also offers a deep segmental analysis, shedding light on key product and application segments. Readers are provided with actual market figures for the size of the global Internet Security market, in terms of value and volume, for the forecast period 2021-2027.

The following companies are covered as key players in the Global Internet Security Market research report: HPE, IBM, Intel, Symantec, AlienVault, BlackStratus, Check Point Software Technologies, Cisco, Cyren, Fortinet, F-Secure, Gemalto, Kaspersky Lab, Microsoft, Palo Alto Networks, RSA, Sophos, Trend Micro, Trustwave Holdings, Wurldtech Security Technologies.

Free Sample Report + All Related Graphs & Charts @ https://www.datalabforecast.com/request-sample/313547-internet-security-market

North America accounted for the largest share of the Internet Security market in 2020, owing to increasing collaboration activities by key players over the forecast period.

Detailed Segmentation:

Global Internet Security Market, By Product Type: Malicious software, Denial-of-service attacks, Phishing, Application vulnerabilities.

Global Internet Security Market, By End User: Government, Banking, financial services, and insurance (BFSI), Manufacturing, Information communication and technology (ICT), Retail, Healthcare.

Market Overview of Global Internet Security

Geographically, The Internet Security market report studies the top producers and consumers, focuses on product capacity, production, value, consumption, market share and growth opportunity in these key regions, covering: North America, Europe, China, Japan and others.

Grab Your Report at an Impressive Discount (Use Corporate email ID to Get Higher Priority) @ https://www.datalabforecast.com/request-discount/313547-internet-security-market

We are currently offering a quarter-end discount to all our high-potential clients and would like you to take advantage of it and leverage your analysis based on our report.

Furthermore, the following points on the Global Internet Security Market are covered, along with a detailed study of each:

Major Players: The report provides company profiling for a decent number of leading players of the global Internet Security market. It brings to light their current and future market growth taking into consideration their price, gross margin, revenue, production, areas served, production sites, and other factors.

Internet Security Market Dynamics: The report shares important information on influence factors, market drivers, challenges, opportunities, and market trends as part of market dynamics.

Global Internet Security Market Forecast: Readers are provided with production and revenue forecasts for the global Internet Security market, production and consumption forecasts for regional markets, production, revenue, and price forecasts for the global Internet Security market by type, and consumption forecast for the global Internet Security market by application.

Regional Market Analysis: It could be divided into two different sections: one for regional production analysis and the other for regional consumption analysis. Here, the analysts share gross margin, price, revenue, production, CAGR, and other factors that indicate the growth of all regional markets studied in the report.

Internet Security Market Competition: In this section, the report provides information on Competitive situations and trends including merger and acquisition and expansion, market shares of the top three or five players, and market concentration rate. Readers could also be provided with production, revenue, and average price shares by manufacturers.

Browse Full Report with Facts and Figures of Internet Security Market Report: https://www.datalabforecast.com/industry-report/313547-internet-security-market


Major highlights of the Internet Security market during the Covid-19 pandemic covered in the report:

- Market competition among key manufacturers in the industry.
- Sourcing strategies, industrial chain information, and downstream buyers' data.
- Analysis of distributors' and traders' marketing strategies, focusing on region-wise needs during the Covid-19 pandemic.
- Vendors providing a wide range of product lines and intensifying the competitive scenario during the Covid-19 crisis.
- Highlights of the key growth sectors of the Internet Security market and how they will perform in the coming years.

Buy Full Copy Global Internet Security Report 2021-2027 @ https://www.datalabforecast.com/buy-now/?id=313547-internet-security-market&license_type=su

** The market is evaluated based on the weighted average selling price (WASP) and includes taxes applicable to the manufacturer. All currency conversions used in the creation of this report were calculated using 2021 annual average exchange rates.

Crucial points encompassed in the report:

Customization Available

With the given market data, researchers offer customization according to the company's specific needs. The following customization options are available for the report:

Regional and country-level analysis of the Internet Security market, by end-user.

Detailed analysis and profiles of additional market players.

About Us

Transforming Information into Insights

We pride ourselves on being a niche market intelligence and strategic consulting and reporting firm, driven to make a powerful impact on businesses across the globe. Our accuracy estimation and forecasting models have earned recognition across the majority of business forums.

We source online reports from some of the best publishers and keep updating our collection to offer you direct online access to the world's most comprehensive and recent database, with skilled perceptions on global industries, products, establishments, and trends. We at Data Lab Forecast wish to assist our clients to strategize and formulate business policies, and achieve formidable growth in their respective market domains. Data Lab Forecast is a one-stop solution provider, from data collection and outsourcing of data to investment advice, business modelling, and strategic planning. The company reinforces clients' insight on factors such as strategies, future estimations, growth or fall forecasting, opportunity analysis, and consumer surveys, among others.

Contact:

Henry K
Data Lab Forecast
86 Van Wagenen Avenue, Jersey, New Jersey 07306, United States
Phone: +1 917-725-5253
Email: [emailprotected]
Website: https://www.datalabforecast.com/
Follow Us on: LinkedIn | Twitter

More Trending Reports by Data Lab Forecast:

Web hosting vs WordPress hosting: Which is best? – ITProPortal

If you're looking to make your website live, you'll need some form of web hosting. The best web hosting services provide the required infrastructure, including storage, bandwidth, and a suite of other features to get your website up and running, once you've used one of the best website builders to put it together.

In your search for hosting, you'll likely come across two terms: web hosting and WordPress hosting. Web hosting is an umbrella term that covers all kinds of hosting, including shared hosting, virtual private server (VPS) hosting, dedicated server hosting, and cloud hosting.

WordPress hosting also falls under this general term, but it's a type of platform-specific hosting optimized for WordPress sites. With WordPress powering over 40% of all websites on the internet, it's certainly an option worth considering. But should you go for web hosting or the best WordPress hosting for your site?

In this web hosting vs WordPress hosting comparison, we'll compare their features to help you decide which one to go with. Although performance, support, and pricing depend largely on the provider you choose, we'll give you an inkling of what you can expect with each type of hosting. Read our WordPress review to find out more about the service, which occupies a unique space across website building and web hosting.

Often, web hosting (including WordPress hosting) comes with included features such as a free domain, SSL certificates, email accounts, marketing features, and ecommerce features. But there are some key differences worth noting between general web hosting and WordPress hosting.

One key feature of WordPress hosting is that it only works for WordPress-based sites. In other words, if you're using CMSs like Drupal, Magento, Joomla, Squarespace, or Dreamweaver, you shouldn't choose WordPress hosting. However, most web hosting options can run WordPress with ease. So general hosting gives you far more flexibility in selecting a content management system.

One thing that stands out about WordPress hosting is that it usually comes with the WordPress software pre-installed on your server. You won't have to work through potentially confusing or difficult installation processes, as you often do with shared or other hosting, making dedicated WordPress hosting an attractive option for newbies.

With managed WordPress hosting, you will generally have full technical support, with a team on hand to install updates, including updates for your WordPress software and security patches. Such updates ensure that your website is robust and secure. This feature is particularly helpful if you aren't tech-savvy and don't want to manually handle such updates.

Other web hosting options can also offer some form of automatic updates, although you often have to go for more expensive plans to access these. Additionally, third-party integrations help you extend the functionality of your website. Thankfully, most web hosts have a wide range of third-party apps and widgets, including themes and plugins, to help you customize your site.

WordPress hosting is somewhat more limited than most other forms because it can't be used with other CMSs or site-creation platforms. However, WordPress itself has a vast library of plugins, themes, and third-party apps and widgets that you can take advantage of.

Your website's performance affects everything from your brand reputation to sales. Any website that takes more than a few seconds to load is sure to turn visitors away. In some cases, WordPress hosting can offer a higher level of performance compared to general shared hosting options.

First, a WordPress host's entire tech stack is typically designed to make your WordPress site perform better, whether we're looking at uptime, page load times, server responses, or something else.

With WordPress hosting, you'll get WordPress-specific caching that saves your data in the form of static files so that your site can load faster. While other hosting options generally offer some kind of caching, it is not usually WordPress-centered by default, and it tends to be a tad slower.

However, you can use a CDN (content delivery network) to improve the performance of your website, whether you're using WordPress hosting or web hosting. CDNs store your website's images, videos, and pages across several data centers. That way, the distance between a server and a user is reduced, and a failure in a single server doesn't affect the performance of your website.

It is important to note that even the most basic shared hosting can be configured to run a WordPress website in a fast, efficient manner. This will take a fair amount of work, though.

Unsurprisingly, the level of customer support you get depends on the kind of web hosting you choose. People on shared plans typically get access to a knowledge base, in addition to phone, live chat, and email support. But response times, especially by email, can be slow.

Higher paying customers, like those on VPS and dedicated hosting plans, often get priority customer support. Some hosts even offer a strategic team on standby (24/7) to help you address issues quickly.

The advantage of choosing WordPress hosting, especially managed WordPress hosting, is that you get a support team who are experts in WordPress. Managed WordPress hosting allows you to focus on your business, while the support team undertakes proactive monitoring to identify issues as soon as possible.

Take Kinsta's top-level WordPress team, for instance. They check the status of the websites they host every two minutes; that amounts to a whopping 720 checks each day. So they'll notice and get right on top of any website issues before you even know it.

Web hosting prices vary widely depending on the kind of site you want to run and the extra features you need. Shared hosting, the cheapest of the lot, ranges anywhere from less than $1 to $15 a month, or even higher. For instance, InMotion Hosting's shared plans start at $2.49 a month, and can go as high as $12.99 a month. Meanwhile, Bluehost's plans range from $2.95 a month to $13.95 a month.

VPS hosting can be much pricier, although low-end plans come in at less than $5 a month. However, the price can go as high as $80 a month, or even higher. For example, Bluehost's entry plan for VPS hosting costs $19.99 and the highest plan costs $59.99. Expect to pay at least $70 a month for dedicated hosting, increasing all the way to $300 a month depending on the extra features you choose.

WordPress hosting also has an extensive pricing structure. You can get it as cheap as basic shared hosting in many cases, and prices can reach thousands per month for custom enterprise solutions. InMotion's WordPress hosting starts at $4.99 a month, but you have to be ready to pay for three years.

WPEngine has one of the more expensive managed WordPress hosting plans we found, with costs rising to about $241 a month.
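Because intro prices and contract lengths vary so much, it can help to compare plans on total cost over the initial term. Here is a quick sketch using the prices quoted above; term lengths other than InMotion's stated three-year prepay are our assumptions, and renewal rates (which are usually higher) are ignored:

```python
# Toy comparison of total cost over the initial term, using the prices quoted
# above. Term lengths other than InMotion's stated three-year prepay are
# assumptions, and renewal rates (which are usually higher) are ignored.
PLANS = {
    "InMotion shared (intro)":     (2.49, 36),    # ($/month, months in term)
    "Bluehost shared (intro)":     (2.95, 36),
    "InMotion WordPress (intro)":  (4.99, 36),    # three-year prepay required
    "WP Engine managed WordPress": (241.00, 12),
}

for name, (monthly, months) in PLANS.items():
    print(f"{name}: ${monthly * months:,.2f} over {months} months")
```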

Which is best: web hosting or WordPress hosting? The short answer is that it depends. If you are running a WordPress site, choosing WordPress hosting is a no-brainer. You'll get better features, higher performance, and more effective support.

That said, several web hosts offer WordPress-focused hosting packages that are worth looking at. Although these packages are not WordPress hosting, they have features tailored for good performance of WordPress sites. However, if you are running your website on other CMSs, we recommend choosing from the wide range of web hosting options out there.

For small businesses and people building their first site, shared or WordPress hosting will likely be your best bet. If you want more space, bandwidth, and control over your website, choose between VPS hosting, dedicated hosting, and cloud hosting.

Bedrock modernizes seafloor mapping with autonomous sub and cloud-based data – TechCrunch

The push for renewable energy has brought offshore wind power to the forefront of many an energy company's agenda, and that means taking a very close look at the ocean floor where the installations are to go. Fortunately Bedrock is here to drag that mapping process into the 21st century with its autonomous underwater vehicle and modern cloud-based data service.

The company aims to replace the standard big ship with a big sonar approach with a faster, smarter, more modern service, letting companies spin up regular super-accurate seafloor imagery as easily as they might spin up a few servers to host their website.

"We believe we're the first cloud-native platform for seafloor data," said Anthony DiMare, CEO and co-founder (with CTO Charlie Chiau) of Bedrock. "This is a big data problem: how would you design the systems to support that solution? We make it a modern data service, instead of like a huge marine operation. You're not tied to this massive piece of infrastructure floating in the water. Everything from the way we move sonars around the ocean to the way we deliver the data to engineers has been rethought."

The product Bedrock provides customers is high-resolution maps of the seafloor, made available via Mosaic, a familiar web service that does all the analysis and hosting for you, a big step forward for an industry where data migration still means shipping a box of hard drives.

Normally, DiMare explained, this data was collected, processed, and stored on the ships themselves. Since they were designed to do everything from harbor inspections to deep sea surveys, they couldn't count on having a decent internet connection, and the data is useless in its raw form. Like any other bulky data, it needs to be visualized and put in context.

Image Credits: Bedrock

"These data sets are extremely large, tens of terabytes in size," said DiMare. "Typical cloud systems aren't the best way to manage 20,000 sonar files."

The current market is more focused on detailed, near-shore data than the deep sea, since there's a crush to take part in the growing wind energy market. This means that data is collected much closer to ordinary internet infrastructure and can be handed off for cloud-based processing and storage more easily than before. That in turn means the data can be processed and provided faster, just in time for demand to take off.

As DiMare explained, while there may have been a seafloor survey of a potential installation site done in the last couple of decades, that's only the first step. An initial mapping pass might have to be made to confirm the years-old maps and add detail, then another for permitting, for environmental assessments, engineering, construction, and regular inspections. If this could be done with a turnkey automated process that produced even better results than crewed ships for less money, it's a huge win for customers relying on old methods. And if the industry grows as expected to require more active monitoring of the seafloor along every U.S. coast, it's a win for Bedrock as well, naturally.

Image Credits: Bedrock

To make this all happen, of course, you need a craft that can collect the data in the first place. "The AUV is a piece of technology we built solely to enable a data product," said DiMare, who noted that, originally, "we didn't want to do this."

"We started to spec out what it looked like to use an off-the-shelf system," he explained. "But if you want to build a hyper-scalable, very efficient system to get the best cost per square meter, you need a very specific set of features, certain sonars, the compute stack. By the time we listed all those we basically had a self-designed system. It's faster, it's more operationally flexible, you get better data quality, and you can do it more reliably."

And amazingly, it doesn't even need a boat: you can grab it from the back of a van and launch it from a pier or beach.

"From the very beginning one of the restrictions we put on ourselves was no boats. And we need to be able to fly with this thing. That totally changed our approach," said DiMare.

Image Credits: Bedrock

The AUV packs a lot into a small package, and while the sensor loadout is variable depending on the job, one aspect that defines the craft is its high-frequency sonar.

Sonars operate in a wide range of frequencies, from the hundreds to the hundreds of thousands of hertz. Unfortunately, that means that ocean-dwelling creatures, many of which can hear in that range, are inundated with background noise, sometimes to the point where it's harmful or deters them from entering an area. Sonar operating above 200 kHz is safe for animals, but the higher frequency means the signal attenuates more quickly, reducing the range to 50-75 meters.

That's obviously worthless for a ship floating on the surface; much of what it needs to map is more than 75 meters deep. But if you could make a craft that always stayed within 50 meters of the seabed, it's full of benefits. And that's exactly what Bedrock's AUV is designed to do.

The increased frequency of the sonar also means increased detail, so the picture its instruments paint is better than what you'd get with a longer wavelength. And because it's safe to use around animals, you can skip the (very necessary but time-consuming) red tape at wildlife authorities. Better, faster, cheaper, and safer is a hell of a pitch.
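To see why frequency and range trade off this way, consider seawater absorption, which rises steeply with frequency. The toy calculation below uses Thorp's empirical absorption formula; it was fitted at lower frequencies and is extrapolated here, so treat the 200 kHz figure as a rough illustration rather than a specification of Bedrock's hardware:

```python
# Toy illustration of why high-frequency sonar is short-ranged: seawater
# absorption climbs steeply with frequency. Thorp's empirical formula below
# was fitted at lower frequencies and is extrapolated here, so treat the
# 200 kHz figure as a rough illustration, not a Bedrock specification.
def thorp_absorption_db_per_km(f_khz: float) -> float:
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)
            + 44.0 * f2 / (4100 + f2)
            + 2.75e-4 * f2
            + 0.003)

for f in (10, 50, 200):
    print(f"{f:>4} kHz: ~{thorp_absorption_db_per_km(f):5.1f} dB/km absorbed")
# At 10 kHz absorption is roughly 1 dB/km; at 200 kHz it is tens of dB/km, so
# once spreading loss is added the usable range collapses from kilometers to
# tens of meters, consistent with the 50-75 m working range described above.
```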

Today marks the official launch of Mosaic, and to promote adoption Bedrock is offering 50 gigs of free storage of any kind of compatible map data, since the platform is format-agnostic.

There's a ton of data out there that's technically public but is nevertheless very difficult to find and use. It may be a low-detail survey from two decades ago, or a hyper-specific scan of an area investigated by a research group, but if it were all in one place it would probably be a lot more useful, DiMare said.

"Ultimately we want to get where we can do the whole ocean on a yearly basis," he concluded. "So we've got a lot of work to do."

The pandemic's long-term legacy on cloud adoption rates and what comes next – ITProPortal

After a year of hardship and disruption, many UK businesses will have viewed July's Freedom Day and its easing of the remaining Covid restrictions as a welcome return to some sort of normality. The long-awaited return to the office and face-to-face meetings is finally here. However, very few organizations will be planning a reversion to the old paradigms.

The pandemic accelerated a trend that was already well underway: cloud adoption. During the first weeks of chaos, the need to facilitate a mass move towards remote working, boost resilience, and switch to online customer communications meant that the use of cloud services was no longer optional. And, with more than 4 in 5 UK businesses already stating plans to implement hybrid strategies moving forward, cloud workloads are only likely to increase even further.

However, cloud environments come with their own unique set of data-related challenges. Whether it's integration or security, businesses need to act now to get one step ahead.

Although many organizations have been on the journey to the cloud for several years in some way or another, the pandemic undoubtedly created a new impetus. It forced business leaders across all sectors to reevaluate their priorities and more often than not cloud adoption came out on top.

In fact, Denodo's 2021 Global Cloud Survey discovered that cloud adoption has risen 25 percent year-on-year, with more complex workloads moving to the cloud. It also reported that almost 40 percent of the 150 businesses questioned are now leveraging hybrid cloud (combining on-premise and cloud services), and some 9 percent have extended their architectures to multi-cloud (more than one cloud service).

There are several reasons why the pandemic accelerated cloud adoption at a mass scale. These include:

- The need to facilitate remote working: This is perhaps the most apparent change brought on by the pandemic. In an effort to follow government guidelines, keep employees safe, and limit the spread of the virus, offices closed their doors and remote work took hold. The need to collaborate online instead of face to face meant that cloud technologies were more essential than ever before. Even industries that were typically slow to adopt due to lack of funding or resources, such as the public sector, had no option when the pandemic hit.

- The need to continue delivering services to customers in a digital landscape: Traditional retailers and service providers needed to accelerate their transformation from face-to-face, store-based customer interaction to online services. Whether it was putting in place new service models for physical goods, such as home delivery and click-and-collect, or adopting new ways to deliver informational services and advice, such as healthcare Software-as-a-Service (SaaS) applications or e-learning, all industries needed to reassess how they communicated with their customers in a new digital world. This is where the cloud came in.

- The need to increase resilience and agility: The first few months of the pandemic, especially, caused fluctuations in the demand for specific services. The cloud enabled businesses to be more flexible and to scale according to this demand. It also helped businesses rapidly adapt to new commercial models and launch innovative products and services to gain a competitive advantage, even during this time of chaos.

There is no doubt that cloud technologies became a lifeline for many businesses over the last 18 months, enabling them to survive one of the toughest economic periods in recent history. However, whilst the mass adoption of cloud technologies solved some of the immediate problems caused by the pandemic, it has also created a very unique set of data-related challenges. If organizations are to continue to reap the benefits of cloud long-term, they need to act now and put the processes in place to thrive in our new hybrid landscape.

Businesses typically find themselves struggling during three stages of maturity in the cloud deployment journey:

A common thread throughout these stages is data integration. When moving to the cloud, the huge increase in data volumes stored across multiple sources can make data difficult to keep track of. This exacerbates existing challenges around data security and maintaining governance and compliance, as well as other regulatory practices such as data sovereignty, lineage, and ownership.

To make matters worse, traditional means of moving and copying data are no longer fit for purpose. In fact, the most common method of data integration, Extract Transform Load (ETL), where data files are extracted from an existing source, transformed into a common format, and loaded into a new location, has been around since the 1970s. It's no surprise that this technique's limitations, most prominently around complexity, performance, and security and governance, are becoming increasingly apparent in our data-intensive age. This is where data virtualization comes in.

Data virtualization complements ETL and other methods of integration, such as the Enterprise Service Bus (ESB) and Enterprise Application Integration (EAI), by removing the need to move and copy data during the journey to the cloud. This gives agility and self-service capabilities to the business, without compromising on security and governance.

As consumers, many of us actually already use a very similar model in the home when consuming entertainment through services such as Netflix and Spotify. We don't hold the physical content in our homes, on DVDs, Blu-rays, or CDs. Instead, we review the information about the film or music (the metadata) on a platform to decide what we want and, when selected, that content is viewed in real-time from some unknown location in the cloud.

Data virtualization works like this, but for enterprises. Using this technology, the data is kept at the source and only abstracted and consumed in a report, dashboard, or application in real-time, when it is needed. This is completely different from the bulk moving and copying of data used in ETL and data warehousing models. It is enabling businesses to gain real-time data insights, to vastly reduce the movement of data around the enterprise, and to provide centralized security and data governance irrespective of the data sources. For example, for a user who wanted to run a query for a new analytics report, a data virtualization layer would hold details about all the data they might want to consume. It would only be at runtime, however, that the system would abstract and combine the actual data from the data source locations for their request. Much like the domestic model for entertainment described above, the content is provided only as and when it is needed.
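As a conceptual sketch (with invented names and toy data, not Denodo's actual API), the difference can be reduced to where the data lives and when it is read:

```python
# Conceptual sketch only: invented names and toy data, not Denodo's actual
# API. It contrasts ETL-style physical copying with a virtual view that
# resolves queries against the live source at runtime.

# ETL style: extract, transform, and load a second physical copy up front.
def etl_load(source_rows):
    warehouse = []                                   # duplicate copy to keep in sync
    for row in source_rows:                          # extract
        warehouse.append({**row, "amount": float(row["amount"])})  # transform
    return warehouse                                 # load

# Virtualized style: keep data at the source; abstract and combine on demand.
class VirtualView:
    def __init__(self, fetch):
        self.fetch = fetch                           # callable reading the live source

    def query(self, predicate):
        # Only at runtime is the actual data pulled and filtered.
        return [row for row in self.fetch() if predicate(row)]

crm_source = lambda: [{"customer": "Acme", "amount": "1200.50"}]  # live source stub
view = VirtualView(crm_source)
print(view.query(lambda row: float(row["amount"]) > 1000))
```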

With cloud technologies set to play an increasingly important role moving forward, organizations need to act now to ensure that they are able to overcome any data-related challenges and maximize value from their pandemic investment. Modern technologies, such as data virtualization, could bring enormous agility to businesses, removing the complexities from hybrid and multi-cloud architectures as users no longer need to worry about where the data is held or what format or protocol is needed for access. Adopting these technologies could help businesses to thrive, no matter what is around the corner.

Charles Southwood, Regional VP Northern Europe and MEA, Denodo

How the cloud is powering the future of flexible working – LocalGov

After months of the new normal, local councils have experienced the true benefits of remote working, including increased flexibility and no commuting, and have seen the financial rewards in huge cost efficiencies. As a result, while some organizations plan either to return fully to the office or to keep working from home for the foreseeable future, a great many others, local councils included, will adopt a hybrid model of both on-site and remote locations.

And it goes without saying that this hybrid model of working is fundamentally enabled by cloud computing.

Flexible workforce

By hosting applications in cloud services rather than on-site, public services such as councils give users access to their resources in real-time, from anywhere, on any device, as long as they're connected to the internet. Multiple employees are able to collaborate on centrally stored projects and exchange ideas instantly, even from different locations. This enhances the quality of the work being produced, as well as each individual's time-efficiency. Business leaders have in fact noticed an increase in productivity when using online tools for project management, video conferencing and file sharing. A survey by TalkTalk Group, for instance, found that 58% of workers in the UK have been more productive as a result of working from home, while over half (52%) said they never expect to return to a five-day working week in the office.

Now, imagine a post-pandemic world where cloud computing didn't exist: working from home would've been catastrophic, to say the least. Instead of collaborating on planning policies or housing benefits at the same time, public servants would have to rely on a never-ending exchange of emails containing feedback and iterations. And, as you can imagine, it's easy to lose track of the newest version of a document with so many messages flying around.

Cost-efficiency

Cloud solutions are much easier to scale than traditional data centres and servers. A council can temporarily scale resources back or up as it pleases, to accommodate staff members or changes in traffic. Instant activation of services and speedy adjustments are key components in supporting growth. They also provide much-needed agility.

Additionally, the cloud significantly reduces spending on hardware and infrastructure, consequently shrinking IT budgets. Think about it: when you buy an office and all of the equipment to fill it, your capital expenditure is high, even if it's a one-off payment. By moving to the cloud, however, you shift to an operational expenditure model, where you pay as you go for the resources you use.
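As a back-of-the-envelope illustration of the capex-versus-opex difference, here is a small worked example in Python; all the figures are invented and will vary enormously in practice.

```python
# All figures below are invented purely for illustration.
on_prem_capex = 50_000        # one-off purchase of servers and kit
cloud_monthly = 900           # pay-as-you-go for the resources used
months = 36                   # a three-year horizon

on_prem_total = on_prem_capex            # paid up front, used or not
cloud_total = cloud_monthly * months     # scales with actual usage

print(f"On-premises over 3 years: {on_prem_total}")
print(f"Cloud (opex) over 3 years: {cloud_total}")
# A quiet quarter shrinks the cloud figure further; the hardware
# invoice cannot be scaled back once it has been paid.
```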

Increased security

To say that accessing sensitive data from outside the office requires a robust cybersecurity strategy is to state the obvious. Security and compliance are among the top concerns for most businesses, but especially for those in the public sector. Councils, in particular, are a goldmine for cyber criminals given how sensitive the data they hold on their constituents is, with Hackney Council's hack and Redcar and Cleveland's ransomware attack being recent examples.

The truth is, cyber crime against councils is likely to continue to grow. Migrating to the cloud and away from legacy infrastructure can help organisations reach high standards of security by hosting their website, data and applications in enhanced data centres that run on enterprise-grade networking.

If you haven't yet migrated to the cloud, it's important to look into providers' compliance and accreditations to boost the overall level of security (both cyber and physical) of your network and digital information systems. I'd also recommend looking for cloud servers with a no-single-point-of-failure design and a 100% uptime guarantee, ensuring seamless continuity of service.

The future is in the cloud

The COVID-19 pandemic has clearly highlighted the importance of cloud computing. Working in the cloud has been tried and tested. If council leaders were uncertain of its benefits before, they are likely convinced of its potential now. Moving forward, the cloud will continue to be the solution that allows for flexible working, with employees dispersed across multiple locations, be that the office, home, or their favourite coffee shop.

Jon Lucas is co-director of Hyve Managed Hosting

Read more:
uk - Your authority on UK local government - How the cloud is powering the future of flexible working - LocalGov

Read More..

The next normal for business growth: CXOs share their experiences of developing new tech-driven solutions – YourStory

The pandemic affected almost every industry and business function, resulting in short-term pivots and a shift in priorities for many. For others, it validated their offerings. Conversations with industry leaders reveal that the organisations that coped best with the fast-changing external environment brought about by the COVID-19 pandemic were those that leveraged technology to shape solutions for the new normal.

The latest edition of the Digital Pioneers Club CTO Roundtable series, hosted in association with Google Cloud, brought to the fore these stories of resilience and innovation. Leading the conversations were Manish Mittal, Managing Principal - India, Axtria; Jayateerth Mirji, VP Technology, TeamLease; Vijay Sivaram, CEO, Quess IT Staffing, Quess Corp; Mani Rangarajan, Group COO, PropTiger; and Mitesh Agarwal, Head of Customer Engineering, Google Cloud India.

The weeks that followed the onset of the pandemic saw businesses creating new opportunities by accelerating the adoption of new business models. For instance, real estate advisory PropTiger launched a virtual platform, PropTiger Direct, through which customers could experience digital home-buying. It provided access to multiple projects in different localities of a city, digital brochures, project and locality videos, pre-recorded webinars with experts from the area, virtual site tours, and complete coverage of the site and its locality through drone footage. Mani points out that there were a lot of naysayers about real estate at the start of the pandemic; however, contrary to market expectations, the sector witnessed a resurgence in consumer interest, aided by the adoption of remote and hybrid working and the fact that people felt safest in their own homes. In addition, the sector saw the entire ecosystem of realty developers, brokers, house owners and consumers come forward to embrace digitisation. "We saw a massive change in people's outlook towards digitisation," he said.

Quess Corp, a leading business services provider offering services such as workforce and asset management to 2,600+ customers across 10 countries, reimagined three pivotal requirements of organisations that the pandemic had displaced: people, productivity and technology. "We needed to figure out how people could work in different formats, be it work from home, hybrid or back in the office. We also had to ensure that there were systems in place for employees to stay productive, and build metrics that could define productivity under the new circumstances. From a technology perspective, we needed structural infrastructure that could ensure smooth day-to-day operations. So, we needed to make sure that people, productivity and technology were in place at all times and kept improving," shared Vijay.

One of India's leading human resource companies, TeamLease Services saw an acceleration in its transformation from a technology-enabled company to a technology-driven one. "Services like hiring, onboarding, employee cycle management and financial reconciliation, which are enabled by technology, are at the core of our business. In addition, we also have SaaS products like Digital Workforce Solutions that address some of the most critical challenges faced in workforce management. In the last year, we saw an increase in the adoption of these technologies, which in turn brought an extensive acceleration in the evolution of our technology offerings, with new capabilities and features," shared Jayateerth.

"At Google and Google Cloud, the way we have worked with organisations in the past is phenomenally different from the last 18 months," shared Mitesh. He underlined that Google's work with organisations has traditionally focused on three broad areas. First, how Google can help in an organisation's cultural transformation while embracing technology. Second, how to help businesses build scale, security and resilience by enabling access to the same set of technologies irrespective of the size of the business. "Here, Google Cloud has been a big lever in terms of democratising access to technologies," he said. Third, how to enable organisations to make better data-driven decisions.

He shared, "Bringing all three together - organisational and cultural transformation, access to best-in-class technology, and data-driven decision-making - creates a very powerful condition, and that unleashes innovation and transformation. And this is what we've been doing with a number of customers across the globe, including in India." Mitesh highlighted Google Cloud's recent partnership with Indian telecommunications giant Reliance Jio Infocomm to build solutions in 5G for the enterprise and consumer segments as an example in this direction. "In a very short period of time, India will have its first 5G phone, powered by Google Cloud."

"Data warehousing has been around for a long, long time. But what has really changed in the last couple of years has been the journey of data to the cloud," shared Manish of Axtria, a global provider of cloud software and data analytics to the life sciences industry. Manish highlighted that cloud providers like Google Cloud and others in the industry today provide the security and privacy that are critical for enterprises.

In addition, privacy regulations, such as the Health Insurance Portability and Accountability Act (HIPAA), the US federal law that requires the creation of national standards to protect sensitive patient health information, and the General Data Protection Regulation 2016/679 (GDPR), the European Union law on data protection and privacy, have further built consumer and industry confidence. This has enabled enterprises to get comfortable with taking data to the cloud, building their data lakes, data warehouses and so on, and making structured and unstructured data available to business owners without them having to deep-dive into the tech stack. And this has accelerated in the last 18 months. Five years ago, enterprises were only talking about applications and the cloud, but they are now hosting data on the cloud. It is no longer a question mark.
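As a hedged illustration of what this looks like in practice, the snippet below queries a cloud warehouse with the google-cloud-bigquery Python client; the project, dataset and table names are assumptions, not anything from the discussion.

```python
# Requires: pip install google-cloud-bigquery, plus default credentials.
from google.cloud import bigquery

# The project name is a hypothetical placeholder.
client = bigquery.Client(project="my-analytics-project")

# Dataset and table are likewise invented for illustration.
sql = """
    SELECT region, COUNT(*) AS patient_visits
    FROM `my-analytics-project.clinical.visits`
    GROUP BY region
"""

# The business analyst writes SQL against the warehouse and never
# needs to know how or where the underlying storage is managed.
for row in client.query(sql).result():
    print(row["region"], row["patient_visits"])
```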

Mitesh highlighted that, while various definitions of multi-cloud environments exist and are often accurate in context, most organisations looking at multi-cloud are looking at it for growth. Here, he pointed out how Google Cloud's Anthos is an innovative product that serves as a model platform for both hybrid and multi-cloud environments. Anthos unifies the management of infrastructure and applications across on-premises, edge and multiple public clouds with a Google Cloud-backed control plane for consistent operation at scale. "We released Google Cloud Anthos a couple of years back and have built some fantastic use cases. We have organisations in banking, telecom, manufacturing and life sciences that have really started using this. Anthos enables organisations to freely choose where they want to deploy and provides them with the ability to manage it in a unified way," he said.

He predicted that 90 percent of enterprises globally will move towards a multi-cloud strategy, making Anthos highly relevant for the market. With Anthos, customers get the freedom to chart their own cloud and transformation journeys. In addition, Anthos also provides the capability to run BigQuery across multiple clouds.
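Anthos itself is operated through Google's own tooling, but the property it builds on - one consistent API across clusters in different clouds - can be sketched with the standard Kubernetes Python client; the cluster context names below are assumptions.

```python
# Requires: pip install kubernetes, plus a kubeconfig containing
# these (hypothetical) contexts, one cluster per cloud or site.
from kubernetes import client, config

contexts = ["gke-prod", "eks-prod", "onprem-dc1"]

for ctx in contexts:
    # The same API calls work regardless of which cloud hosts the
    # cluster; this is the property a unified control plane builds on.
    config.load_kube_config(context=ctx)
    pods = client.CoreV1Api().list_pod_for_all_namespaces(watch=False)
    print(f"{ctx}: {len(pods.items)} pods running")
```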

The discussion also highlighted how startups can build new tech products by keeping the needs of the business and the customer at their core. PropTiger's Mani shared, "We are a proptech company, and we have technology firmly in our DNA. As a tech-driven company we have realised that you should not build a technology that then requires a market; instead, you need to focus on the market and understand the kind of technology solutions that fit it." He shared an example of how, at Housing.com, they saw an opportunity to increase customer engagement on the platform via digitisation in the early days of the pandemic. Unlike e-commerce marketplaces, which enjoy a high level of engagement, customers visit property portals like Housing.com only when they want to buy, rent or sell a house. "We realised there are a lot of periphery activities around buying, renting or selling that open up opportunities to engage with the customer," he explained.

This observation led PropTiger to launch Housing Edge, a full-stack rental and allied services platform through which the company enabled the digitisation of multiple services that tenants and landlords can avail of. These services include online rent payment, rental agreements, tenant verification, packing and moving, furniture rental, home interiors, and home services. The move has led to an exceptional increase in engagement because of the product-market fit. "Today, the platform sees about 300,000 transactions every month, with a high repeat usage rate. About 80 percent of people who come to pay rent on the platform use the product every month," he explained.

The discussion also saw the panelists sharing the technologies they have used extensively to build and strengthen their product offerings, and the impact these have had on their growth and digitisation journeys.

Visit link:
The next normal for business growth: CXOs share their experiences of developing new tech-driven solutions - YourStory

Read More..