
Advancing industry convergence through technology and innovation – MIT News

Launched in October 2020, the MIT and Accenture Convergence Initiative for Industry and Technology is intended to demonstrate how the convergence of industries and technologies is powering the next wave of change and innovation. The five-year initiative is designed to advance three main pillars: research, education, and fellowships. As part of the third pillar, Accenture has awarded five fellowships to MIT graduate students working on research in industry and technology convergence who are underrepresented, including by race, ethnicity, and gender. The recipients of the inaugural Accenture Fellows program are working across disciplines including electronics, textiles, machine learning, economics, and supply chain. Their research has the potential to advance innovation and technology to influence industry convergence and to broaden the convergence process to virtually all industries through creative problem-solving, the accelerated adoption of new technologies, unique collaborations, and thinking imaginatively and boldly.

"Accenture has long focused on how creativity and ingenuity can help solve some of the world's most complex problems. When we wanted to explore the convergence of industry and technology, we turned to MIT to extend our longstanding partnership with education, research, and fellowships that delved deeper into this topic," says Sanjeev Vohra, global lead of applied intelligence at Accenture.
"The Accenture Fellows awards underscore our strong commitments to education, innovation, research and discovery, and creating opportunities that will help accelerate the achievements of these future champions of change."

Research being conducted by the fellows covers an array of critical work, including: developing robot-aided therapy to improve balance in impaired subjects; leveraging the increasing availability of data in the gig economy; using machine learning to process locally generated waste for use as alternative energy in low-income municipalities; examining operational challenges that may arise from barriers to extending credit and sharing information among supply chain partners; and designing and applying electronic textile technology to low-Earth orbit, prompting an opportunity for convergence among the electronics, textile, and space technology industries.

"These fellows are prime examples of the incredible cross-disciplinary work happening at the nexus of industry and technology," says Anantha Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. "We are tremendously grateful for Accenture's commitment to our students, and for their goal of supporting and advancing student innovation and discovery through these fellowships."

Student nominations from each unit within the School of Engineering, as well as from the four other MIT schools and the MIT Schwarzman College of Computing, were invited as part of the application process. Five exceptional students were selected as the inaugural fellows of the initiative:

Jacqueline Baidoo is a PhD student in the Department of Materials Science and Engineering, exploring policy related to materials use. Specifically, her research is focused on waste-to-energy (WTE) strategies that could be adopted at the municipal level to treat and process locally generated waste for use as alternative energy.
Her goal is to use machine learning to reduce the barrier to entry of WTE practices in low-income municipalities through the development of a tool that informs municipal decisions around waste management and the construction of WTE facilities. Baidoo earned a BS in chemistry and a BA in physics from Xavier University of Louisiana and a BS in chemical and biomolecular engineering from Georgia Tech.

Juliana Cherston is a PhD student in the Media Lab. Her work in the Responsive Environments Group is focused on bringing electronic textile technology to low-Earth orbit, prompting an opportunity for convergence among the electronics, textile, and space technology industries. Specifically, she is augmenting large-area space fabrics with active sensory functionality, weaving vibration-sensitive piezoelectric fibers and charge-sensitive conductive yarns into these specialized materials. Cherston earned a BA in physics and computer science from Harvard University.

Olumurejiwa Fatunde is a PhD student in the Center for Transportation and Logistics. Her research examines operational challenges that may arise from barriers to extending credit and sharing information among supply chain partners in informal settings. With the proliferation of novel payment platforms, cryptocurrency usage, and natural language processing, Fatunde postulates that there is an opportunity to drive convergence across financial services, telecommunications, and other customer-facing industries in emerging markets. Specifically, she is investigating how technologies could trickle down to the smallest, least-formal organizations, helping them to create value for consumers and to be a part of the global economy. Fatunde earned a BA in biomedical engineering from Harvard University and an MS in international health policy from the London School of Economics in the U.K.

André Medeiros Sztutman is a PhD student in the Department of Economics.
Leveraging the increasing availability of data in the gig economy, his work focuses on the development of tools for tackling adverse selection in insurance markets. By creating tools that make better use of information, especially in situations where it is particularly needed, he is contributing to the convergence of different industries: gig platforms, reporting agencies, and the insurance business. Medeiros Sztutman earned a BS in economics from the Universidade de São Paulo in Brazil and an MS in economics from the Pontifícia Universidade Católica do Rio de Janeiro in Brazil.

Kaymie Shiozawa '19 is a master's student in the Department of Mechanical Engineering, exploring how robot-aided therapy could potentially address the challenge of improving balance in impaired subjects. Drawing on her experience designing human subject experiments, applying machine learning and mathematical simulations, and designing complex mechanisms for robotics and medical devices, Shiozawa aims to design a variable impedance cane and a novel protocol known as AdaptiveCane, which encourages unaided balance by progressively reducing the level of assistance provided as a user's performance improves. Shiozawa earned a BS in mechanical engineering from MIT.

Originally posted here:

Advancing industry convergence through technology and innovation - MIT News

Read More..

Verifying the Universe with Exascale Computers – HPCwire

The ExaSky project, one of the critical Earth and Space Science applications being solved by the US Department of Energy's (DOE's) Exascale Computing Project (ECP), is preparing to use the nation's forthcoming exascale supercomputers. Exascale machines will enable the ExaSky team to verify the gravitational influences, gas dynamics, and astrophysical inputs that they use to model the universe at unprecedented fidelity, as well as address forthcoming challenge problems to predict and replicate high-accuracy sky survey data.

Explaining his work for a general audience, Salman Habib, the director of Argonne's Computational Science Division and an Argonne Distinguished Fellow, notes, "The ExaSky team is adapting our Lagrangian-based Hardware/Hybrid Accelerated Cosmology Code (HACC) and adaptive mesh refinement cosmology code (Nyx) to run on GPU-accelerated exascale hardware. These machines will give us the ability to incorporate more complex physics arising from diverse inputs, such as the presence of massive neutrinos, models of star and galaxy formation, and several sources of astrophysical feedback, such as active galactic nuclei, galactic winds, and supernova explosions. These will be incorporated into both codes and run on larger grids with finer resolution. The idea is that the similar physical models in both codes should provide similar results at many different scales even though the two codes utilize different computational approaches. Obtaining similar results from both simulations helps validate our understanding of the physical processes that are occurring in nature. After that, we can add new features like star formation to replicate, via simulation, observed sky survey data to verify our results and make the simulation come alive."

Habib continues, "The ExaSky effort has wide impact, as it gives scientists a computational tool to assist in the verification of gravitational evolution, gas dynamics, and the subgrid models used in the ExaSky cosmological simulations when run at a very high dynamic range. ExaSky is an important application effort for addressing forthcoming DOE challenge problems."

A Crisis in Cosmology?

Understanding the accelerated expansion of the universe is one of the scientific questions that the ExaSky team aims to investigate.

Observations of the universe confirm that the universe is expanding and the expansion rate is increasing with time. The underlying cause of this acceleration is not understood, and cosmologists refer to it generally as dark energy, a convenient shorthand coined 20 years ago for encapsulating this lack of understanding.

Cosmic acceleration and other similar insights were enabled by several observational advances coupled with improved theory and modeling. The current model of cosmology, which includes ingredients such as dark energy and dark matter (a form of matter that interacts gravitationally in the normal way but has very weak interactions, if any, with atomic matter), provides a very good description of astronomical and cosmological observations. Small discrepancies do exist, and there is uncertainty as to whether these discrepancies indicate new physics, which would be very exciting, or are the result of measurement artifacts, because cosmological measurements are often complex and difficult to obtain. One such discrepancy is the so-called Hubble tension: the current rate of the universe's expansion, as estimated by different techniques, shows a moderate level of disagreement. Another potential problem relates to how galaxies cluster. Galaxies are not randomly distributed in the universe but follow a well-measured statistical distribution. The measured clustering can be used to predict gravitational lensing (i.e., the distortion of shapes of background objects by intervening matter), but the measured lensing signal is too low.

If these discrepancies are signposts pointing to new discoveries, then they could result in an extremely exciting series of watershed moments that advance our understanding of the universe and uncover new aspects of the fundamental physics of matter and its interactions. Potential impacts include a possible modification of general relativity at large distances and the addition of new sectors to the Standard Model of particle physics.

Examining the Fundamental Properties of Matter

Consistent with the expansion of the universe as time moves forward, the energy density of the universe must increase as we go back in time. Thus, the universe functions as a sort of particle accelerator, allowing access to higher and higher energies the deeper into space and time we can look. Habib notes that scientists use this type of information to examine fundamental properties of matter, such as the mass of neutrinos. The analysis of current cosmological observations, such as the anisotropies in the temperature of the cosmic microwave background or the distribution of galaxies at large length scales, provides an upper bound on the sum of neutrino masses.
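The scaling behind this "particle accelerator" picture can be made concrete with the standard textbook relations (an illustration added here, not a formula from the article): for a universe described by a scale factor a(t), the energy densities of matter and radiation fall off as the universe expands,

```latex
% Standard FLRW dilution relations: matter density dilutes with volume,
% radiation picks up one extra factor of a from redshift.
\rho_{\mathrm{matter}} \propto a^{-3}, \qquad
\rho_{\mathrm{radiation}} \propto a^{-4}
```

so running the clock backward (a → 0) drives both densities, and hence the characteristic particle energies, arbitrarily high.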

Habib believes that scientists can also use ExaSky simulations to examine other scientific problems, such as the nature of dark matter and the nature of primordial fluctuations in the cosmic microwave background. Succinctly, tiny temperature variations, or fluctuations, at the part-per-million level in this afterglow radiation left over from the Big Bang can offer great insight into the origin, evolution, and content of the universe.[i]

Simulating Data with Strict Observational Accuracy Requirements

Tying simulation to observed data is a necessary step in validating any computer model. The ExaSky team plans to verify its simulation results against data gathered from sky-survey observations.

The ExaSky page on the ECP website provides a more detailed description of the sky survey data and the challenge problems that are being addressed by the ExaSky team. A summary of this description is provided as follows.

Technical and scientific details of the challenge problems can be found in the ExaSky/HACC CoPA Tutorial presented at the ECP Annual Meeting on February 6, 2020.

Using GPUs from AMD, NVIDIA, and Intel

Habib notes that their codes are performing well on all platforms in preparation for the exascale future, including GPUs from AMD, NVIDIA, and Intel.

Both codes have now been ported to Intel, NVIDIA, and AMD GPUs. The Heterogeneous-Computing Interface for Portability (HIP) translation layer was used to create code for the AMD GPUs. For the Intel GPUs, Habib notes, "We program close to the metal and are using the new Intel GPU hardware and oneAPI software. We are doing well, but it's not a direct translation from CUDA."

A Strict Measure of Performance

To measure performance, the team uses a very stringent figure of merit (FOM), as shown in Figure 1. The FOM is a quantitative metric of an application's scientific work rate. As the code is optimized to run faster and/or with more complex physics, the FOM increases.
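As a rough illustration of how such a work-rate metric behaves (the formula below is a simplified stand-in, not the ExaSky team's actual FOM definition):

```python
# Illustrative figure-of-merit sketch: scientific work (problem size
# weighted by physics complexity) divided by time to solution. This is
# a simplified stand-in, not the ExaSky project's actual FOM formula.

def figure_of_merit(particles: float, physics_weight: float, seconds: float) -> float:
    """Work rate: more particles, richer physics, or less runtime all raise the FOM."""
    return particles * physics_weight / seconds

baseline = figure_of_merit(1e9, 1.0, 1000.0)   # reference run
optimized = figure_of_merit(1e9, 1.0, 50.0)    # same problem, 20x faster

print(optimized / baseline)  # -> 20.0
```

Either optimizing the code to run faster or adding physics (a larger `physics_weight`) at the same runtime raises the metric, matching the behavior described above.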

Habib observes that the HACC code can approach peak floating-point performance on a device because much of the local particle interaction computation fits in the register memory. On GPUs, register memory is the only memory subsystem that can support peak floating-point performance.[iii] The register memory is implemented as an array of processor registers inside a register file on each of the GPU's streaming multiprocessors. If a calculation exceeds the capacity of the GPU register file, then register spilling occurs, in which some of the computation is offloaded to slower memory. Using slower memory can incur a significant performance penalty, which would prevent an application from realizing peak floating-point performance.[iv]

Habib noted, "Assuming codes scale from Summit to an exascale platform, FOM ratios of 20 of performant codes on Summit imply a factor of roughly 100 at the exascale, which is impressive." The current scaling on Summit is shown in Figure 2. The "projected" FOM is a measured value of the FOM; "projected" assumes that the scale-up on an exascale system will be successful.

Summary

To date, the ExaSky team reports that it has successfully incorporated gas physics and subgrid models within its codes and has added advanced software technology to analyze simulation data. The team's next steps include adding more physics and, once ready, testing the software on next-generation hardware as the systems come online.

[i] "Fluctuations in the Cosmic Microwave Background," Wilkinson Microwave Anisotropy Probe Homepage, NASA, updated August 20, 2014. https://wmap.gsfc.nasa.gov/universe/bb_cosmo_fluct.html.

[ii] "Highlights June 2008," TOP500 List, updated June 18, 2008. https://top500.org/lists/top500/2008/06/.

[iii] Rob Farber, ed., CUDA Application Design and Development (Morgan Kaufmann, 2012), https://www.sciencedirect.com/book/9780123884268/cuda-application-design-and-development.

[iv] Sparsh Mittal, "A Survey of Techniques for Architecting and Managing GPU Register File," IEEE Transactions on Parallel and Distributed Systems 28, no. 1 (January 2017): 16, https://doi.org/10.1109/TPDS.2016.2546249.

Rob Farber is a global technology consultant and author with an extensive background in HPC and machine learning technology development that he applies at national labs and commercial organizations.

Read more:

Verifying the Universe with Exascale Computers - HPCwire

Read More..

Wanted: Cybersecurity Professionals to Protect Businesses, the Nation’s Infrastructure – UNLV NewsCenter

At headquarters, Mai Vo opened a channel to receive the secret, encrypted message from her spy on the ground.

Within minutes, and with a password in hand, she decoded the message, placing her one step closer to achieving the mission: becoming a cyber star.
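A toy version of that exercise can be sketched in a few lines of Python (the XOR scheme and the sample password here are invented for illustration; a real curriculum would use a vetted cipher such as AES):

```python
# Toy symmetric cipher for a spy-message exercise: XOR each byte of the
# message against a repeating password. Applying it twice with the same
# password recovers the original text. Illustration only -- XOR with a
# repeating key is NOT secure; real code should use a vetted library.
from itertools import cycle

def xor_cipher(data: bytes, password: str) -> bytes:
    """Symmetric XOR: the same call both encodes and decodes."""
    return bytes(b ^ k for b, k in zip(data, cycle(password.encode())))

secret = xor_cipher(b"meet at hq at noon", "hunter2")
print(xor_cipher(secret, "hunter2").decode())  # -> meet at hq at noon
```

With the password in hand, decoding is just a second application of the same function, which is why the campers could unlock the message within minutes.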

Vo, a rising sophomore at West Career & Technical Academy in Las Vegas, is one of 33 local high school students who got a crash course in cybersecurity this week during UNLV's third annual GenCyber Summer Camp.

"I think the Earth is too mysterious, and I want to learn as much as I can about space, and Earth, and technology," said Vo. "I find it interesting to learn how things work and why everything is as it is."

Though she has plans to study mechanical engineering in college one day, Vo said the camp opened her eyes to the possibility of exploring a career in cybersecurity - keeping hackers at bay and protecting the nation's critical infrastructure, from power grids to transportation systems, from coming under attack.

And that's exactly what the camp hopes to achieve. With nearly half a million cybersecurity-related jobs open across the nation, UNLV is hoping to inspire the next generation of cybersecurity professionals and help to fill a critical skills gap.

"One of the things that interested me the most is what they said when we asked about the salary," said Deven Slivka, a rising senior at Western High School. "It's whatever you want. You can pick your salary if you're good enough."

Yoohwan Kim, camp co-director and UNLV computer science professor, said 70% of small companies go out of business after a cyber attack.

"The chance of getting attacked is very, very high, and an attack can be happening for months before a company realizes it," Kim said. "There is not enough protection."

Cybercrime is estimated to cost the world $10 trillion annually by 2025, said Ju-Yeon Jo, a computer science professor who co-leads the summer camp with Kim. The two also head up, along with colleagues in engineering and business, UNLV's master's in cybersecurity program, which opened this spring.

"Recruitment is crucial not only for businesses, but also for the protection of our nation's infrastructure," Jo said.

Through activities ranging from decoding encrypted spy messages and cyber treasure hunting to learning what it means to be a good digital citizen, Jo and Kim hope the students will become ambassadors for cybersecurity at their respective high schools.

"When they go back to school, they can be pioneers and create cybersecurity clubs or activities," Jo said.

Just two days into the weeklong experience, where participants enjoy daily prizes, team activities, and catered food, Vo had already given the camp her stamp of approval.

"It's a great experience, and so worth it," Vo said. "I'm glad I fixed my sleep schedule for this."

UNLV GenCyber Camp is provided at no cost to participants thanks to a grant from the National Security Agency and the National Science Foundation. UNLV is one of 98 institutions offering camps across the country this summer.

Read the original:

Wanted: Cybersecurity Professionals to Protect Businesses, the Nation's Infrastructure - UNLV NewsCenter

Read More..

Hostwinds Review 2021: Not The Cheapest Web Hosting Option, But A Quality One – Forbes

Hostwinds boasts that it owns 100% of its servers, systems, and structures; as such, the company can get issues resolved quickly and in a more cost-effective manner. For this reason, Hostwinds claims it passes the savings on to its customers. All plans come with unlimited storage and bandwidth, free website migration, and unlimited email accounts.

Its shared hosting plans come in three tiers:

Hostwinds also has hosting services specifically for businesses; the pricing tiers are similar to its regular shared plans. The main difference is that its business packages offer faster loading speeds, using LiteSpeed web servers and optimized network path selection. Plans start at $10.49 per month with one domain.

Customers can also sign up for other hosting services, including VPS, cloud, and dedicated server hosting. Both Linux and Windows servers are available. Prices will vary depending on the type of hosting and how much storage space, RAM, and bandwidth are needed.

Hostwinds backs up your website each evening so that your important information and files will be secure. That way, if anything goes wrong, such as a malware attack, you'll be able to restore your website to a fairly recent version. You'll also be able to keep your backups indefinitely and access them whenever you wish. This is rare for basic shared hosting plans, which may only include backups bi-weekly, weekly, or not at all.
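Conceptually, a nightly backup with indefinite retention amounts to something like the following sketch (the paths and naming scheme are invented for illustration; the review does not describe Hostwinds' actual mechanism):

```python
# Sketch of a nightly, date-stamped site backup that is never pruned,
# mirroring the retention behavior described above. Names and paths are
# illustrative, not Hostwinds' actual implementation.
import datetime
import pathlib
import shutil

def nightly_backup(site_dir: str, backup_root: str) -> pathlib.Path:
    """Archive the whole site into backup_root/site-YYYY-MM-DD.tar.gz."""
    stamp = datetime.date.today().isoformat()
    base = pathlib.Path(backup_root) / f"site-{stamp}"
    # make_archive appends ".tar.gz" and returns the final path
    return pathlib.Path(shutil.make_archive(str(base), "gztar", site_dir))
```

Restoring to "a fairly recent version" then simply means unpacking the newest archive that predates the incident.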

The shared and business hosting plans include Weebly, a drag-and-drop website builder. This tool is great if you're a small business owner who is just starting out and wants to create a simple website yourself. You can customize the layout using various themes and ensure that it's also mobile-ready.

Otherwise, you can use another website builder, such as WordPress, depending on your technical skill level or whether you're hiring someone to build your site.

Visit link:
Hostwinds Review 2021: Not The Cheapest Web Hosting Option, But A Quality One - Forbes

Read More..

Cloud Technology and Healthcare Evolution: Microsoft in the Spotlight – HIT Consultant

Gerry Miller, CEO & Founder, Cloudticity

In April, software giant Microsoft made a lot of headlines announcing its multibillion-dollar acquisition of Nuance, the cloud-based clinical intelligence developer best known to healthcare providers for its Dragon and PowerScribe speech-recognition products.

Business analysts and reporters zeroed in on impressive financial details and utilization potential for ambient AI technologies in health settings. But more than anything, the deal shows how serious Microsoft is about its healthcare IT ambitions and how central its Azure cloud service is to those goals.

Long before the acquisition news (or even the launch of Microsoft Cloud for Healthcare last year), Microsoft had been aggressively investing in making its Azure cloud computing service attractive to healthcare for hosting, building, testing, deploying, and managing applications and services. It's worth noting that all of Nuance's leading speech-to-text healthcare products, designed to integrate nicely with electronic health record (EHR) systems, are software-as-a-service (SaaS) offerings built on Microsoft Azure.

The Cloud and Healthcare IT

In the age of digital transformation, the healthcare industry is leveraging the cloud for more than nifty EHR documentation services. Organizations need its flexibility to rapidly scale resources without big capital expenditures, build and host myriad applications, facilitate collaboration, generate clinical and operational insight, and deal with expanding volumes of health data. In the hyperconnected and data-deluged modern world, the cloud is really the only feasible option for computing and storage infrastructure moving forward in most industry sectors, and that includes healthcare. Microsoft knows this.

But cloud utilization in healthcare comes with unique requirements: health data are sensitive, protected, and subject to distinct regulatory constraints. In the US, maintaining HIPAA and ONC Cures Act Final Rule compliance, and ensuring the privacy and security of, as well as appropriate accessibility to, protected health information (PHI), is compulsory.

And while public cloud providers like Microsoft Azure supply guidance and resources for designing Cures Act- and HIPAA-compliant environments, that doesn't mean that everything on Azure is automatically safe for healthcare use. Cloud utilization comes with shared responsibilities, and healthcare organizations using the cloud are responsible for their own regulatory compliance and data protection functions and processes.

The IaaS Shared Responsibility Model

In a traditional data center, the organization owns and is responsible for security entirely, from physical space and server hardware to the network, data, and applications. With Infrastructure-as-a-Service (IaaS) and public clouds like Azure, the security responsibilities are shared between the user (in this case, the healthcare organization) and the cloud infrastructure provider (Microsoft).

For example, Microsoft ensures that its physical infrastructure is secured, and assumes responsibility for hardware and facility access control across geographical locations. It also ensures that its Azure cloud service is fault-tolerant and reliable, with failover provisions for outages.

But customers using Azure are responsible for securing the data they put in the cloud and the way their applications behave (for example, by enforcing complex password policies and authentication measures to ensure that hackers can't easily break in).
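A minimal sketch of what enforcing a complex password policy might look like on the application side (the length and character-class thresholds here are illustrative, not a Microsoft or HIPAA requirement):

```python
# Illustrative password-policy check of the kind a cloud customer must
# enforce in its own application code. Thresholds are made up for the
# example; pick values to match your actual security policy.
import re

def meets_policy(password: str, min_length: int = 12) -> bool:
    """Require minimum length plus upper, lower, digit, and symbol classes."""
    return (
        len(password) >= min_length
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(meets_policy("Correct-horse-battery-7"))  # -> True
print(meets_policy("password123"))              # -> False
```

Checks like this live in the customer's code, not in Azure itself, which is exactly the division of labor the shared-responsibility model describes.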

Microsoft will sign a HIPAA Business Associate Agreement (BAA) with Azure healthcare customers that defines and covers in-scope services, as is required by law for HIPAA compliance. But the healthcare organization using Azure still bears responsibility for achieving and maintaining its state of HIPAA compliance and ensuring its cloud instances are configured correctly.

This IaaS shared-responsibility model is a lot like renting an apartment. The landlord may be responsible for the safety and soundness of the building as a whole, but you're still responsible for locking the door to your own apartment.

The Future of Healthcare IT

It may sound like all this requires outsized effort just to manage IT, but the truth is that modern healthcare IT is experiencing a complex evolution. There are many industry-specific considerations organizations must navigate to master cloud utilization, and regulatory compliance is only one of them. On the other side of all that effort lies technological capability that can profoundly transform day-to-day operations.

The upsides of cloud power are too significant to ignore: scalable, agile, cost-efficient technology resources running secure, reliable, and largely automated services that extend capabilities while actually reducing complexity.

Microsoft's continued interest in the healthcare industry is a good thing, and its cloud service is helping to drive a virtuous cycle in healthcare innovation. For example, automatic speech recognition is an incredibly compute-intensive function. Without Azure's cloud power, would Nuance have even become a healthcare trailblazer worthy of such high valuation? The cloud model has enabled the development and use of tools that can listen as a doctor chats with a patient and automatically generate EHR documentation. It's pretty amazing when you think about it, and it will power more evolutionary leaps in healthcare IT moving forward.

About Gerry Miller

Gerry Miller is CEO and founder of Seattle-based Cloudticity, a digital enablement partner for the healthcare industry. Gerry is a serial entrepreneur and healthcare fanatic with over 30 years in the technology industry. Prior to Cloudticity, Gerry was brought in as the chief operating officer at ePrize; he turned around a failing company that was eventually sold for a fourfold return on the initial private equity investment. Before ePrize, Gerry spent eight years at Microsoft, first as chief technology officer for the US central region, then running the global business unit that oversaw General Motors (Microsoft's second-largest customer), growing that account from $20MM to over $100MM in three years. Prior to Microsoft, Gerry spent nearly a decade in the technology consulting and startup industry. He holds all five AWS certifications.

Read more:
Cloud Technology and Healthcare Evolution: Microsoft in the Spotlight HIT Consultant - HIT Consultant

Read More..

How We Cancel-Proofed Our Online Start-Up By Leaving The Cloud – The Federalist

Big Tech corporations worked in concert this past January to deplatform Parler, a fast-growing social network friendly to conservatives. Parler's vendors, such as Amazon Web Services (AWS), bowed to influential ideological forces and weaponized their terms of service, basing their actions on the expectation that Parler could not moderate content to AWS's satisfaction on a timely basis (a subjective standard impossible for any company of any size to meet).

When AWS enforced its evolving content moderation policies on Parler, with maximum consequences, the entire tech industry realized they could do it to anyone. Watching AWS and its partners' arbitrary enforcement against specifically and only Parler, while ignoring the many other offenders they host, highlighted the vulnerabilities any conservative company reliant upon the cloud faces.

CaucusRoom.com is a social network designed to help conservatives gather, encourage, and engage locally. We are a small but growing player among conservative platforms that see a need and a business opportunity. Operating on the Cancel Cloud posed a liability to our company legally, financially, and technically. Now unshackled from Big Tech's chains, CaucusRoom is better off in every respect.

Tech startups use cloud platforms because they offer cost-effective, incremental, and instantly scalable access to hosting, infrastructure, data storage, and a host of other key services. However, these platforms and hosting providers tend to breed ecosystems of tightly integrated sub-vendors and service-boosters, which can severely disincentivize a company from operating outside of the ecosystem, a situation colloquially known as a "walled garden" or "vendor lock-in." In addition to the constellation of services it already offers, AWS owes much of its market dominance to its well-established network of cloud platforms and providers.

At CaucusRoom, not only did we subscribe to some AWS services, but nearly every other platform we subscribed to in turn extended AWS's services to host our website and core services, run our infrastructure, and store our data. These services included container orchestration, managed databases, caching systems, static content storage, and load balancers, to name a few. The myriad contracts multiplied our vulnerability to a cancel moment, as each sub-vendor must adhere to AWS's terms in addition to its own.

When AWS deplatformed Parler, all of AWS's parasitic sub-vendors booted Parler as well. Imagine being booted off of a Mac when all of your programs are made for Mac, and you can't find any other computer designed to run your programs.

Fortunately for CaucusRoom, our tech stack was still small and nimble enough to maneuver off the cloud on our own terms. Within days of the Parler deplatforming, we received about a dozen calls from conservative-friendly data centers and tech vendors. We easily found a data center with owners anxious to help companies like ours. The customer service is fantastic. Every person we work with is someone we've personally met. Can you imagine saying that about a Big Tech company?

The move took about a month of preparation, testing, and transitioning. Backend infrastructure management is a different engineering discipline than the front-facing website seen by our users, but fortunately, our engineers spoke the language. If needed, our data center hosts also offer a team ready to personally help make the transition, and our monthly fee includes a few hours of their engineering time whenever needed. Now we are using the hardware we want, directly, and without a gaggle of woke gatekeepers.

CaucusRoom is now faster off of the cloud. We can move data around more easily to speed up queries. We can balance traffic in ways that reduce bottlenecks. Colocation of the core services for the site significantly increases responsiveness and eliminates the need for many third-party cloud services.

But what about the cost? The total monthly cost is just a few hundred dollars more per month (about 10 percent). It's a bargain given the engineering time saved and the assurance offered to our investors and users.

By moving off of the cloud, we reduced our total number of online hosting vendors by 80 percent, while increasing our selection of possible data solutions. Each time we eliminate a data vendor, we eliminate a potential cancel moment that could paralyze our site.

If we do find ourselves in need of a move, shifting quickly to a new data center becomes much easier: it would take minutes, instead of weeks, to untangle our data contracts and sign new ones. The same goes for the engineering required.

As a startup, eliminating potential risk is critical to raising investment. It's also important to reassure potential customers; in our case, those customers include political campaigns and conservative causes looking for a new home away from Facebook. Conservative digital campaign directors know they are not safe from cancellation on Facebook or Twitter, and they need confidence that any new platform they use cannot be arbitrarily wiped off the internet.

In the chaotic world of politics, as in business, it's important to focus on what you can control. After Parler's deplatforming, anyone with a website may control less than he previously thought. We encourage you to take an inventory of your website's tech stack. How many of your vendors, and their terms, are subservient to a Big Tech master?

Beware of vendor lock-in and keep your stack nimble. Get a few bids for services from conservative-friendly data centers. Most will bend over backward to help you cancel the cloud, and even help with any engineering needed to make the move. Your costs won't go up much, but if they do, make sure to tell your investors and supporters you're hosted on a freedom-loving data center. They will likely double down, reassured that you've reduced your risks.

Just a few weeks after moving off the Cancel Cloud, CaucusRoom added new investors and landed a major national network of conservative activists. Our product improved, our risks decreased, and our future capabilities expanded.

Matt Knoedler is the Co-Founder and CEO of CaucusRoom.com. Nathan Carlson is the company's Lead Engineer.

See the article here:
How We Cancel-Proofed Our Online Start-Up By Leaving The Cloud - The Federalist


Only 50% of Amazon Is Retail – Marketplace Pulse

Amazon's retail sales are down to only 50% of the company's total revenue. It now generates nearly as much revenue from its services businesses, like AWS cloud hosting, Prime memberships, the third-party marketplace, and advertising.

In the second quarter, Amazon's services business grew nearly three times faster than its retail sales. Amazon sold $53 billion worth of products online and $4 billion in physical stores like Whole Foods, up 15% for a combined $57 billion. All services together brought in $56 billion, up 42% year-over-year.
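As a quick sanity check on those figures, the headline "50% of revenue" claim follows from simple arithmetic on the numbers quoted above (a rough sketch; Q2 2021, USD billions):

```javascript
// Q2 2021 figures quoted above, in USD billions.
const onlineSales = 53;    // products sold online
const physicalStores = 4;  // Whole Foods and other physical stores
const services = 56;       // AWS, Prime, marketplace, advertising combined

const retail = onlineSales + physicalStores; // 57
const total = retail + services;             // 113

// Retail's share of total revenue: 57 / 113, roughly half.
console.log(`retail share: ${((retail / total) * 100).toFixed(1)}%`);
// prints retail share: 50.4%
```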

Given how long Amazon has been growing its third-party marketplace, AWS cloud hosting, and other services businesses, it is perhaps surprising that it took until 2021 for retail sales, the company's original business, to fall to just 50% of the total. But then, there is no compelling reason for it to forgo retail for the marketplace. And so it continues to do both.

Three years ago, in 2018, retail sales were 60% of Amazon's revenue. The third-party marketplace, its second-largest business unit, has been growing faster than retail for years. It will soon be half as large as retail and will generate over $100 billion in revenue for Amazon in 2021.

Advertising has started to accelerate as brands' appetite for ads drives up ad prices. It was up 87% in the second quarter, accelerating for the fourth quarter in a row and posting the fastest growth in nearly three years. The business is now half as large as AWS cloud hosting.

The second quarter of 2020 saw sales on Amazon spike due to pandemic tailwinds. Compared to that, this year's second quarter saw expectedly weak growth (it would have been even lower had Amazon not pulled Prime Day into the quarter). The current quarter will show slower growth still: Amazon's guidance for the third quarter is 10% to 16%.

For the remainder of the year, the big issue is fulfillment. There, the company said it has been playing catch-up pretty much since the pandemic started. It added that units shipped to its warehouses, both retail and FBA by third-party sellers, have doubled in two years. Thus inventory restrictions affecting the marketplace are not going away anytime soon, and fulfillment will again be a factor in the fourth quarter.


The 6 Most Popular Types of Web Hosting Services in 2021 – London Post

It's no secret that entrepreneurs need high-performing websites. Your site is the key to boosting your visibility and scaling your company. To ensure that your website performs as it should, you'll need to choose the best web hosting service.

Before you can even choose a provider, it's important to learn about the different types of web hosting. To help you get up to speed, let's explore the most popular types of web hosting services.

Colocation means renting space for your hardware and servers from a third-party data centre. Usually, colocation providers include the physical space, security, cooling systems, networking, and redundant power. You pay to host your own servers in the facility while maintaining ownership of the servers, networking equipment, and storage.

By using a colocation service, you can limit the size of your data centre and reduce the associated expenses. Colocation offers a scalable business solution: companies can access higher bandwidth levels to support their growing web traffic.

Cloud Hosting uses the Internet to provide access to computing resources, websites and apps. It differs from traditional hosting because it does not rely on just one server.

Cloud hosting offers a powerful solution, using a network of both physical and virtual servers. There are many cloud hosting benefits, including scalability, reliability, and pay-as-you-go pricing.

Some businesses want web hosting services that are optimized for their WordPress site. These businesses can choose WordPress Hosting, which generally falls under two categories.

Shared WordPress hosting is a shared model specifically designed for WordPress sites. Managed WordPress Hosting offers extra advantages, including staging, server caching, better security, and improved page speed. Improved security is particularly useful for WP sites.

Because WordPress is the most popular CMS in the world, hackers continually attempt to identify WordPress vulnerabilities and exploit them.

Shared Hosting provides a basic web hosting solution. When you choose this option, you'll be sharing resources with multiple websites, all on the same server.

Shared hosting is the least expensive hosting system. A large group of people are paying the server costs, meaning you can get a basic plan for around 2-12 per month.

Shared hosting sites are incredibly easy to set up; you won't need any tech knowledge to get started. These hosting packages are best for amateur bloggers and small startups. The downsides? You may experience slower page speeds, plus shared hosting isn't a particularly scalable solution.

Dedicated Hosting means that your site has its very own server. With a server that's dedicated to your site alone, you'll experience improved performance and ultimate flexibility. Like cloud hosting, dedicated hosting is incredibly powerful.

It can be an expensive option, so why would you pay out? Improved performance, full control over the server environment, and better security are the main draws.

Virtual Private Server hosting uses a shared server to replicate the experience of a dedicated server. Technically you're still using a shared physical server; however, your website gets a virtual space to call its own.

VPS systems offer improved performance compared with a shared hosting system. They aren't quite as powerful as dedicated hosting systems, yet they can offer a similar experience at a reduced price.

With a VPS system, you can adjust the environment and install apps without the support of the hosting provider. These hosting systems are private, scalable, cost-effective, and reliable.

Using tech, you can improve your business in plenty of different ways. To enhance the performance of your site, you need the best web hosting solution. Colocation services and cloud-based services both offer powerful and cost-effective business solutions. If you're a new startup on a limited budget, you might prefer to start with a shared hosting service.

Whichever type of hosting service you choose, it's advisable to compare a few different providers.


HTML smuggling is the latest cybercrime tactic you need to worry about – TechRepublic

It will be hard to catch these smugglers, as they're abusing an essential element of web browsers that allows them to assemble code at endpoints, bypassing perimeter security.


Cybersecurity company Menlo Labs, the research arm of Menlo Security, is warning of the resurgence of HTML smuggling, in which malicious actors bypass perimeter security to assemble malicious payloads directly on victims' machines.

Menlo shared the news along with its discovery of an HTML smuggling campaign it named ISOMorph, which uses the same technique the SolarWinds attackers used in their most recent spearphishing campaign.


The ISOMorph attack uses HTML smuggling to drop its first stage on a victim's computer. Because it is "smuggled," the dropper is actually assembled on the target's computer, which makes it possible for the attack to completely bypass standard perimeter security. Once installed, the dropper grabs its payload, which infects the computer with remote access trojans (RATs) that allow the attacker to control the infected machine and move laterally on the compromised network.

HTML smuggling works by exploiting the basic features of HTML5 and JavaScript that are present in web browsers. The core of the exploit is twofold: It uses the HTML5 download attribute to download a malicious file that's disguised as a legitimate one, and it also uses JavaScript blobs in a similar fashion. Either one, or both combined, can be used for an HTML smuggling attack.

Because the files aren't created until they are on the target computer, network security won't pick them up as malicious; all it sees is HTML and JavaScript traffic that can easily be obfuscated to hide malicious code.
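As a defensive illustration only, the two building blocks named above (the HTML5 `download` attribute and JavaScript blobs) can be sketched in a few lines. The payload here is a harmless text string and the filename is hypothetical; an attacker would embed real file bytes in the page instead:

```javascript
// Defensive sketch of the HTML-smuggling pattern: the "file" never crosses
// the network as a file; it travels as an inline base64 string and is
// reassembled at the endpoint. Harmless demo payload, hypothetical filename.
const B64_PAYLOAD = btoa("benign demo content"); // looks like plain text to scanners

// 1. JavaScript decodes the inline string back into raw bytes in the browser.
const bytes = Uint8Array.from(atob(B64_PAYLOAD), (c) => c.charCodeAt(0));

// 2. In a browser, the bytes become a Blob, and the `download` attribute
//    writes them to disk -- no file ever traverses the network perimeter,
//    so scanners see only HTML and JavaScript.
if (typeof document !== "undefined") {
  const blob = new Blob([bytes], { type: "text/plain" });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "quarterly-report.txt"; // disguised filename (hypothetical)
  link.click();
}
```

Either primitive alone, or both combined, is enough; the decisive property is that the bytes only exist as a file after client-side assembly.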

The problem of HTML obfuscation becomes even more serious in the face of widespread remote work and cloud hosting of day-to-day work tools, all of which are accessed from inside a browser. Citing data from a Forrester/Google report, Menlo Labs said that 75% of the average workday is spent in a web browser, which it said is creating an open invitation to cybercriminals, especially those savvy enough to exploit weak browsers. "We believe attackers are using HTML Smuggling to deliver the payload to the endpoint because the browser is one of the weakest links without network solutions blocking it," Menlo said.


Because the payload is constructed directly in a browser at the target location, detection by typical perimeter security and endpoint monitoring and response tools is nearly impossible. That's not to say that defending against HTML smuggling attacks is impossible, though; it just means companies need to assume the threat is real and likely, and to construct security based on that premise, suggests U.K.-based cybersecurity firm SecureTeam.

SecureTeam has published recommendations for protecting against HTML smuggling and other attacks that are likely to pass with ease through perimeter defenses.



This business is enabling a new level of personalisation with cloud technology – YourStory

As India rides the wave of digital transformation, new trends in technology and business growth are emerging across sectors and geographies. Leading the charge around this transformation are Software as a Service (SaaS) companies, which have been leveraging the power of cloud technology in innovative ways to chart new frontiers of growth.

To showcase and understand the changes that the new India is undergoing, YourStory is hosting Sassy Saturdays, in association with AWS, featuring experts from across India's SaaS ecosystem who will share their unique insights around this digital transformation and how it will shape the ecosystem.

In this edition of Sassy Saturdays, we look at how Inventa, a leading Deep-personalisation Platform developed by Cutting Chai Technologies (CCT) is reimagining communication between people, content, and businesses in an Online-Offline (O2O) world.

In the past decade, the wide adoption of smartphones and Internet-connected devices has heralded a golden age for consumerism. The massive number of products and experiences available anytime and anywhere meant advertisers and marketers needed an edge to grab the attention of consumers.

Recent consumer surveys have shown that such efforts stand a chance of driving a greater impact on consumers only when they speak to them on a personal level. A Salesforce survey noted that 70 percent of consumers believed that a company's understanding of their personal needs influences their loyalty. In a recent Personalisation Pulse Check study by Accenture, over 90 percent of consumers said they are more likely to shop with brands that recognize, remember, and provide relevant offers and recommendations.

Personalisation-based marketing is currently a multi-billion-dollar and fast-growing industry, with companies leveraging advanced analytics and technologies such as artificial intelligence and machine learning (AI/ML) to secure deep consumer insights and provide more personalised omnichannel experiences.

Anand Virani and Rohit Kapoor, both former business leaders at Qualcomm, co-founded CCT after observing that while consumers spent significant amounts of time interacting with both their offline and online worlds, no one was providing real-time personalised recommendations that leveraged a consumer's online and offline footprints.

I saw a world in which the online and offline lives of humans and consumers converge on their mobile phones and smart devices and started to think deeply about the value this could create for both consumers and businesses globally. This led to creating a platform for smart devices that seamlessly blends the online and offline profile of users and matches this with the world around the user, to generate relevant and contextual communication on behalf of our customers, says Anand Virani, Founder and CEO, CCT.

Inventa, the company's flagship product, uses patented technology and proprietary algorithms to create a dynamic and holistic Online-Offline (O2O) profile of a user and to deliver precise, spam-free communication to the user's mobile and/or IoT device.

Businesses can integrate Inventa's Software Development Kit (SDK) with their mobile apps to communicate with their customers based on their preferences, at a suitable time and relevant location. The founders claim that Inventa-powered communication significantly drives customer engagement and conversion across multiple industries and segments.

Digital adoption has accelerated rapidly over the last few years, and customers are comfortable interacting with businesses via their mobile applications. This gives businesses the opportunity to better understand their customers and offer them highly relevant offerings. Inventa enables powerful use cases across consumer-facing industries, including retail, telecom, financial services, and hospitality, and delivers tangible value to the end customer and the enterprise, says Rohit Kapoor, Co-founder and COO, CCT.

For example, by using Inventa, sporting goods shops can send notifications about a sale on high-end running shoes to experienced runners in the park nearby. Hotel concierge services can provide personalised recommendations based on the guest's preferences to make a vacation all the more memorable. Even credit card companies can earn greater brand loyalty by providing customers with personalised recommendations and offers based on their preferences, truly bridging the gap between online and offline experiences.

Right from its core infrastructure to its deep personalisation capabilities, AWS's cloud services play a central role in powering the Inventa platform. To name a few, Amazon Simple Storage Service (Amazon S3) meets its storage requirements, while Inventa's computing needs are handled by Amazon Elastic Compute Cloud (Amazon EC2) instances. AWS Lambda, a serverless compute service from AWS, helps optimise the use of these computing resources.

Inventa's O2O profiling, ML-led recommendations, and communication engine, which are key to its offerings, are also powered by cloud technology. What sets Inventa apart is that it stores and generates data based on a user's online and offline interactions, then integrates and analyses them to create rich, highly personalised profiles that are updated in real time. These deep personalisation capabilities are augmented by Amazon Personalize, along with Inventa's own set of analytics and personalisation algorithms. Inventa then shares these profiles with its customers, helping them enhance the personalisation of their users' experiences.

AWS has been a great cloud partner in our journey to reimagine customer personalisation for the online-offline era. The rich portfolio of AWS services that we have adopted has allowed us to rapidly adapt and deploy Inventa to the diverse needs of our customers. The Amazon Personalize service, with its ability to develop and deploy custom machine learning models, enables us to offer highly customised recommendations capabilities to customers across Industries. The support from the AWS team has been phenomenal, says Badal Shah, Product Management at Inventa.

Inventa is designed with customer data privacy and protection at its core.

Customer communication is triggered via explicit opt-ins and preference sharing. Further, the Inventa platform uses a modular design for the separation of customer data and microservices and offers hosting flexibility to accommodate the needs of multiple types of businesses.

These capabilities are delivered with a fully integrated, end-to-end, managed SaaS platform that allows businesses to set up and deploy at scale within days, not weeks.

With leading payment and retail apps in India and Southeast Asia as its clients, Inventa has been powering the personalisation of the experience of millions of consumers and thousands of physical stores across these regions.

The company has plans to go global with Inventa and is currently raising funds for serving opportunities in North America and the EU via strategic partners and direct deals.

Deep personalisation is the future of customer engagement. Businesses and brands need to build stronger relationships with their customers by offering curated and personalised experiences that speak uniquely to their customers' ever-changing and evolving preferences. This is critical in the post-COVID era as businesses rebuild customer relationships across their online and offline touchpoints. With Inventa, we are excited to be leading this transformation for our customers and partners, says Anand.
