Steve Thomas - IT Consultant

Google Cloud today announced plans for its first cloud region in Mexico, its third in Latin America after Santiago, Chile and São Paulo, Brazil.

The new region, which will be Google’s 35th, will allow it to better serve its local users with lower-latency access to its cloud services, but — and these days, this may be even more important — offer these users data residency and compliance options.

“The cloud region in Mexico will unlock new possibilities for the use of cloud technologies by public sector organizations in the country,” said Juan Carlos Sarmiento Tovilla, the director general of Information Systems at Mexico’s Federal Court of Administrative Justice. “Different public entities would benefit from interoperating in an efficient and secure way, facilitating access to computing power and information technologies.”

In this context, it’s worth noting that Microsoft Azure also announced plans to open a region in Mexico (though that was in 2020 and it’s not open yet) and AWS also announced plans for a region in the country last year, as well as a local zone in Queretaro.

In addition to the new region in Mexico, Google also plans to open new regions in Doha (Qatar), Turin (Italy), Berlin (Germany), Dammam (Kingdom of Saudi Arabia) and Tel Aviv (Israel) in the near future.

At its Inspire conference, Microsoft today announced the launch of the Microsoft Cloud for Sovereignty, a new solution for public sector customers — especially in Europe — who need to be able to guarantee that their users’ data is stored and processed in a given region.

“Today, public sector customers can harness the full power of Microsoft Cloud, including broad platform capabilities, resiliency, agility and security,” the company explains in today’s announcement. “With the addition of Microsoft Cloud for Sovereignty, they will have greater control over their data and increased transparency to the operational and governance processes of the cloud.”

Image Credits: Microsoft

Users will be able to run their workloads in any of Azure’s more than 60 data center regions and access all of the standard Microsoft Cloud services (think Microsoft 365, Dynamics 365 and the Azure platform) and then enable residency options for all of these to meet their regulatory requirements. The company will also offer a ‘Sovereign Landing Zone’ to recommend and enforce compliance options using the same infrastructure-as-code and policy-as-code features as its Azure Landing Zone. This, the company argues, will make it easier for its public sector customers to get up and running, yet also give them the flexibility to tailor these policies to their needs.
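Policy-as-code here means expressing compliance rules as machine-checkable data rather than documents. A minimal sketch of the concept in Python (the policy schema and region names are illustrative assumptions, not Microsoft's actual policy format):

```python
# Minimal policy-as-code sketch: a data-residency rule expressed as data,
# plus a checker that evaluates resources against it. The schema and the
# region names are illustrative only, not Azure's real policy definitions.

RESIDENCY_POLICY = {
    "name": "allowed-locations",
    "allowed_regions": {"germanywestcentral", "francecentral", "swedencentral"},
}

def check_resource(resource: dict, policy: dict) -> bool:
    """Return True if the resource's region satisfies the residency policy."""
    return resource.get("region") in policy["allowed_regions"]

resources = [
    {"name": "db-1", "region": "germanywestcentral"},
    {"name": "vm-2", "region": "eastus"},
]

violations = [r["name"] for r in resources if not check_resource(r, RESIDENCY_POLICY)]
print(violations)  # the resource outside the allowed regions is flagged
```

The point of the pattern is that the same rule definition can both recommend a configuration up front and enforce it continuously, which is what a landing zone automates.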

The other major cloud providers, of course, also offer their own systems for ensuring compliance with data sovereignty regulations. Google recently announced its Sovereign Controls for its Workspace product, for example, while AWS offers all the primitives to build these features but also bets on its network of third-party vendors and consultants to enable these capabilities for its customers. Few, however, have put the same emphasis on data governance as Azure, which may just pay off in its quest to bring more lucrative public sector customers on board.

On a day when Netflix said it would launch an ad-supported service with Microsoft, it would appear that ownership may be poised to compete with streaming in popularity, as consumers begin to realise they can own digital assets, not merely rent or stream them.

Based on a survey of 2,000 Americans (so statistically relevant), a new study found that 77% of Americans would rather own digital content and assets than just rent or stream them. Indeed, some 28% are now spending over $49 a month on digital assets that they actually own, such as digital art, music and in-game items.

Does this represent a shift away from the streaming era as exemplified by Spotify and Netflix towards an era of ownership? It’s clearly still too early to tell definitively, and let’s face it, a movie is very different to a Fortnite in-game skin. But there is a whiff of changing attitudes in this survey.

Suffice it to say that the survey, commissioned by blockchain-driven metaverse platform Virtua and assessed academically, uncovered a strong attachment to owning digital assets. The majority of those surveyed who had purchased digital items (65%) valued them the same as, or more than, physical items. And more than three-quarters (78%) felt “emotionally attached” to them.

The majority of those who purchased digital items (61%) also preferred to receive a digital item (an NFT or in-game ‘skin’) as a gift rather than a physical one. According to the survey, this is in part because digital assets were perceived as better for the environment (38% thought this), although whether respondents were aware of the energy costs of – for instance – minting NFTs was not explored.

Commenting in a statement, Jawad Ashraf, CEO of Virtua, said: “The ‘possessions’ we will own and take into the metaverse are evolving. With the introduction of Web 3.0, the meteoric rise of digital collectibles and the dawn of the Metaverse, we will value our digital items more than ever before.”

Furthermore, 70% said they “connect better” via digital items than physical ones, while 89% said losing these assets would be upsetting. 

Americans also like to own digital items because they “remind them of important moments in their lives (87%)” and “help create the perception of who they want to be” (73%).

Commenting, Dr Janice Denegri-Knott, Professor in Consumer Culture & Behavior at Bournemouth University, said the survey found that more people are attached to their digital items than was previously thought.

“Virtua’s inaugural digital ownership report is based on the largest ever intergenerational survey of its kind… Some people assume we view digital ownership as less advantageous or desirable when compared to physical ownership but the findings of this study suggest otherwise,” she said.

“There is a growing acceptance of digital ownership with lines between owning digital and physical goods becoming blurred, in particular for younger cohorts,” she added.

The research also revealed that different generations view digital assets and other digital items differently. Nine in ten millennials (90%, ages 24–42) feel emotionally attached to digital items, while three-quarters (74%) view them as a good investment.

They also say their digital identity is “important” (80%), or “very important” (47%).

Meanwhile, the younger Gen-Z group (aged 16 – 23) is the most active, with 30% selling digital items and 20% trading them. More than half (52%) of all Gen-Z Americans said they buy, sell or trade digital items.

The research was conducted by Censuswide, which surveyed 2,000 respondents aged 16+ in the USA, based on the ESOMAR survey principles. 

Virtua is the new name for Terra Virtua, a blockchain-driven VR entertainment platform that has raised $2.5M to date, according to Crunchbase, and plans to launch its own digital-asset-driven metaverse.

One more hurdle up ahead for Activision Blizzard, the games giant behind “Call of Duty” that Microsoft is looking to acquire for $68.7 billion. The UK’s Competition and Markets Authority has announced a formal investigation into the proposed deal. This opens the investigation up for feedback from “any interested party” ahead of the CMA deciding whether to embark on a phase-2, deeper enquiry into whether the deal is anticompetitive and represents antitrust violations in the U.K. Those interested parties have until September 1 to respond.

Areas that it will be assessing include whether the deal leads to higher prices, lower quality, or reduced choice in games and the gaming ecosystem.

“The Competition and Markets Authority (CMA) is considering whether it is or may be the case that this transaction, if carried into effect, will result in the creation of a relevant merger situation under the merger provisions of the Enterprise Act 2002 and, if so, whether the creation of that situation may be expected to result in a substantial lessening of competition within any market or markets in the United Kingdom for goods or services,” it notes.

If the acquisition goes ahead, it will be one of the biggest M&A deals ever in technology, led by one of the world’s biggest tech companies and with some of the world’s most popular brands in digital entertainment — which in addition to “Call of Duty” also include “World of Warcraft” and “Candy Crush.” So in that regard the antitrust investigation is a fairly routine move. As the CMA notes, “The deal is set to be reviewed by competition authorities around the world and, as is usual practice, the CMA will engage with its counterparts as appropriate.”

The CMA notes that its formal criteria for investigating a merger are that two or more enterprises cease to be distinct and “either the UK turnover of the acquired business exceed £70 million, or the 2 businesses supply/acquire at least 25% of the same goods/services supplied in the UK and the merger increases the share of supply.”
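The two jurisdictional tests quoted above can be encoded as a simple decision rule. This is a simplified sketch of the thresholds as stated, not a legal interpretation of the Enterprise Act 2002:

```python
def cma_may_investigate(uk_turnover_gbp_m: float,
                        combined_share: float,
                        merger_increases_share: bool) -> bool:
    """Simplified encoding of the CMA's jurisdictional thresholds:
    either the acquired business's UK turnover exceeds GBP 70 million,
    or the merged businesses supply at least 25% of the same goods or
    services in the UK and the merger increases that share of supply."""
    turnover_test = uk_turnover_gbp_m > 70
    share_test = combined_share >= 0.25 and merger_increases_share
    return turnover_test or share_test

print(cma_may_investigate(60, 0.30, True))   # share-of-supply test met
print(cma_may_investigate(60, 0.30, False))  # neither test met
```

Note the "or" between the two limbs: a deal with modest UK turnover can still fall within scope purely on share of supply.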

The U.S. has a similar set of rules in place for triggering antitrust enquiries, so unsurprisingly, the FTC is also currently investigating the deal. Regulators have been known to scupper deals or add provisions to them, as well as nod them through.

The CMA has played a significant role in recent years in the fate of several large tech companies’ business development strategies. Its objections helped scuttle Nvidia’s proposed acquisition of Arm, but it nodded through Microsoft’s acquisition of Nuance for $20 billion. It’s currently also investigating Google’s adtech stack and, at long last, is looking into the Apple-Google duopoly in mobile in the country. Although it didn’t raise much noise over Facebook’s acquisition of WhatsApp several years ago, more recently it ordered the company to sell Giphy. One route to getting past regulatory concerns at the CMA has been for companies to downsize their acquisitions, as eBay and Adevinta did when they sold off several assets to get their classifieds deal through.

The news of the antitrust probe comes at a time when Activision Blizzard has already weathered a number of controversies particularly around its labor and wider human relations practices.

When Microsoft’s bid for Activision was first announced in January of this year, quality assurance testers at Raven Software, a division of Activision, had been on strike for about five weeks over the termination of contracts for some workers and what they saw as unfair treatment by the company, given the stress that this group faces in their daily work. They voted to unionize in May, forming the first union at a major gaming company.

Beyond that, the company has been under intense public and regulatory scrutiny over workplace culture. Activision Blizzard, which overall employs about 10,000 people globally, was the subject of a two-year investigation by the California Department of Fair Employment and Housing. The department eventually filed a lawsuit against the company in July 2021, claiming a “‘frat boy’ workplace culture” and describing it as a “breeding ground for harassment and discrimination against women.” On top of this, Bobby Kotick, who has been the CEO of the company since 1991 (originally at Activision, before its merger with Blizzard), reportedly knew about, yet failed to act on, sexual misconduct and rape allegations.

To be clear, details of the HR drama at the company are not within the parameters of the antitrust investigation, but they do contribute to the bigger picture of a company under the gun.

Those who are giving feedback on the merger to the CMA can do so here and have until September 1 to make their submissions.

Six years on from the referendum in which the United Kingdom voted to leave the EU, and in the midst of an apparent government meltdown, the country is announcing its first international data sharing deal: it has inked an agreement with South Korea, which will allow organizations in the UK to transfer data to the Republic of Korea, and vice versa, without restrictions.

“Data transfers” cover any and all digital services that might be provisioned in one country but used or run in the other. It covers data in services like GPS and smart devices, online banking, research, internet services, and more. South Korea is home to two of the world’s biggest tech and specifically mobile tech companies, Samsung and LG, and already represents some £1.33 billion ($1.6 billion) in international digital trade, the UK said.

“Today marks a huge milestone for the UK, the Republic of Korea and the high standards of data protection we share,” said UK Data Minister Julia Lopez in a statement. “Our new agreement will open up more digital trade to boost UK businesses and will enable more vital research that can improve the lives of people across the country.”

“I am honored to agree to this joint statement today. Strengthening cooperation between the UK and the Republic of Korea based on the shared recognition of high standards of protection can contribute to forming a healthier and more sustainable global data landscape,” added Republic of Korea Commissioner of the Personal Information Protection Commission Jong in Yoon.

South Korea was one of several countries earmarked for a so-called international data adequacy initiative aimed at “unlocking the benefits of free and secure cross-border data flows now the country has left the EU” — the others being the U.S., Australia, Singapore, the Dubai International Finance Centre and Colombia. “The government continues to make excellent progress in its discussions with other priority countries,” it said today.

Ironically, had the UK remained in the EU, it would already have what it achieved today: South Korea already has a data adequacy deal with Europe.

Google, Mastercard and Microsoft were among the companies and outside experts advising the government on this deal as part of an International Data Transfer Expert Council formed earlier this year. The government argues that data transfers and the many regulations that have been built around them have led to “billions of pounds” of trade going “unrealized” due to navigating that landscape.

Specifically, the UK’s Department for Digital, Culture, Media and Sport — which is overseeing the deal — said the idea is that companies and organizations doing business across the two countries will be “able to share data freely and maintain high protection standards” while doing so. Given that the basics of the two countries’ respective data usage policies have theoretically been vetted and harmonized, parties will no longer have to deal with contractual safeguards, it said, including paperwork for International Data Transfer Agreements or Binding Corporate Rules.

Still, you could argue that the time it has taken, the fact that it covers only one country (a partner the UK could have had anyway, sans Brexit), and the fact that today’s deal is still only agreed “in principle” all pull the rug a little from under the argument that Brexit will mean a lot less red tape for the UK going forward when it comes to trade deals.

Getting the deals done with the rest of the priority list will be a start, however. The DCMS estimated that “data-enabled services” to the full list (which includes the U.S.) are currently worth more than £80 billion.

Last week I wrote about an AI startup that’s building technology that can alter, in real time, the accent of someone’s speech. But what if the AI goal instead is to make it possible for people speaking in whatever way they do, to be understood just as they are, and to remove some of the bias inherent in a lot of AI systems in the process? There’s a major need for that, too, and now a UK startup called Speechmatics — which has built AI to translate speech to text, regardless of the accent or how the person speaks — is announcing $62 million in funding to expand its business.

Susquehanna Growth Equity out of the U.S. led the round, with UK investors AlbionVC and IQ Capital also participating. This Series B is a big step up for Speechmatics. The company was originally spun out of AI research in Cambridge back in 2006 by founder Dr. Tony Robinson, and prior to this had only raised around $10 million (Albion and IQ are among those past backers, along with the CIA-backed In-Q-Tel and others).

In the interim it has built up a customer base of some 170 companies — it only sells B2B, to power consumer-facing or business-facing services — and while it doesn’t disclose the full list, some of the names include what3words, 3Play Media, Veritone, Deloitte UK, and Vonage. These variously use the tech not just for making transcriptions in the traditional sense, but for taking in spoken words to help other aspects of an app function, such as automatic captioning, or to power wider accessibility features.

Its engine today is able to translate speech to text in 34 languages. In addition to using the funding to continue improving accuracy and for business development, the company will also add more languages and look at different use cases, such as building speech-to-text that can work in the trickier environment of motor vehicles (where motor noise and vibrations affect how AIs can ingest the sounds).

“What we have done is gather millions of hours of data in our effort to tackle AI bias. Our goal is to understand any and every voice, in multiple languages,” said Katy Wigdahl, the CEO of the startup (a title she previously co-held with Robinson, who has since stepped back from an executive role).

This manifests in the company’s product focus as well as its mission, and that’s something it’s also looking to expand.

“The way we look at language is global,” Wigdahl said. “Google will have a different pack for every version of English but our one pack will understand every one.” It initially only made its tech available by way of a private API that it sold to customers; now in an effort to bring in more users and potentially more paying users, it’s also offering more open API tools to developers to play with the tech, and a drag-and-drop sampler on its site.

And indeed, if one of Speechmatics’ challenges is in training AI to be more human in its understanding of how people speak, the other is to carve out a name for itself against other major providers of speech-to-text technology.

Wigdahl said the company today competes against “big tech” — that is, major companies like Amazon, Google and Microsoft (which now has Nuance) that have built speech recognition engines and provide the tech as a service to third parties.

But it says it consistently scores better than these in tests of how well languages are understood in the many ways they are actually spoken. One test it cited to me was Stanford’s ‘Racial Disparities in Speech Recognition’ study, in which it recorded “an overall accuracy of 82.8% for African American voices compared to Google (68.6%) and Amazon (68.6%).” It said that “equates to a 45% reduction in speech recognition errors — the equivalent of three words in an average sentence.” It also provided TC with a “competitor weighted average.”
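The quoted 45% figure follows from comparing error rates rather than accuracies, and the arithmetic can be checked directly from the numbers in the study:

```python
# Reproducing the error-reduction arithmetic from the cited figures:
# accuracy of 82.8% vs. 68.6% corresponds to error rates of 17.2% and 31.4%.
speechmatics_err = 1 - 0.828   # 0.172
competitor_err = 1 - 0.686     # 0.314

# Relative reduction in errors, not the difference in accuracy points.
reduction = (competitor_err - speechmatics_err) / competitor_err
print(round(reduction * 100, 1))  # ≈ 45.2, i.e. the quoted ~45% reduction
```

This is why a 14-point accuracy gap translates into a much larger-sounding reduction: the denominator is the competitor's error rate, not total words.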

There is indeed a massive opportunity here. Between smaller developers and outsized technology giants like Apple, Google, Microsoft and Amazon, there are hundreds of large companies that may not be at the level (or have the interest) of building in-house AI for this purpose. A company like Spotify, for example, is definitely interested in it, and would prefer not to be reliant on those huge companies, which are sometimes its competitors and sometimes its outright foils. (To be clear, Wigdahl did not tell me Spotify was a customer, but said it is a typical example of the kind of size and situation in which someone might knock on Speechmatics’ door.)

That is partly why investors are so keen to fund this company. Susquehanna has a history of backing companies that look like they might give the power players a run for their money (it was an early and big backer of TikTok).

“The Speechmatics team are undoubtedly a different pedigree of technologists,” said Jonathan Klahr, MD of Susquehanna Growth Equity, in a statement. “We started tracking Speechmatics when our portfolio companies told us that again and again Speechmatics win on accuracy against all the other options including those coming from ‘Big Tech’ players. We are primed to work with the team to ensure that more companies can get exposed to and adopt this superior technology.” Klahr is joining the board with this round.

Indeed, as tech becomes more naturalized and those making it look for more ways to reduce any and all friction that there might be around usage of that tech, voice has emerged as a major opportunity point, as well as a pain point. So having tech that works in “reading” and understanding all kinds of voices can potentially get applied in all kinds of ways.

“Our view is voice will become the increasingly dominant human-machine interface and Speechmatics are the category leaders in applying deep learning to speech, with category defining accuracy and understanding across industry use-case and requirements,” added Robert Whitby-Smith, a partner at AlbionVC. “We have witnessed the impressive growth of the team and product over the last few years since our Series A investment in 2019 and as responsible investors we are delighted to support the company’s inclusive mission to understand every voice globally.” 

Incredibuild, an Israeli startup that has picked up a lot of traction in the worlds of gaming and software development for a platform that drastically speeds up (and reduces the cost of) the shipment of code and related collateral during building and testing — has raised some capital to speed up its own development. The company has picked up $35 million in a Series B round of funding — money that it will be using for product development, as well as to strengthen its ecosystem with more investment into community, developer relations and cloud programs across more markets.

This all-equity round is being led by Hiro Capital, with past backer Insight Partners also participating. We understand from sources close to the startup that the money is coming with a doubling of its valuation: when Incredibuild last raised funds — $140 million in March 2021 led by Insight, which took a big stake in the company at the time — it was at a valuation of $300 to $400 million. The company has doubled its ARR in the last year, and although it doesn’t disclose the actual figure, this round likely puts its current valuation at close to $800 million.

If it sounds odd that a Series B would be so much smaller than the Series A, that’s in part because that previous round was a mix of debt and equity: the company had raised very little since being founded in 2000 and was profitable.

These more recent rounds have been to give the business — which counts companies like Epic, EA, Nintendo, Sony, Microsoft, Adobe and Citibank among its 1,000 customers — capital to build new products on top of those that were already doing well. (Hiro is a VC that focuses on gaming, creator platforms and metaverse technology; and so it can potentially help on that front.)

One example of how Incredibuild has been evolving its product is the company’s deeper move into the cloud. Incredibuild’s first iterations, and still one of the biggest use-cases, were aimed at helping organizations distribute compute across their own on-premise machines.

In a concept not unlike (but not exactly like) peer-to-peer networking, the idea is that there is idle CPU capacity in organizations’ networks at any given time. Incredibuild has built a way to identify those idle gaps, divide up heavy workloads and distribute them to run across those CPUs in real time, and then reintegrate the results at a final end point. Over time, that also incorporated cloud compute.
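The pattern described, carving a heavy job into independent chunks, running them on whatever workers are free and reassembling the results in order, can be sketched with Python's standard library. This illustrates the general concept only, not Incredibuild's implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def compile_unit(chunk: str) -> str:
    """Stand-in for one independent piece of heavy work
    (e.g. compiling a single translation unit)."""
    return chunk.upper()  # placeholder "work"

def distributed_build(chunks):
    # Fan the chunks out across a pool of workers, then reintegrate
    # the results in their original order at a single end point.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(compile_unit, chunks))

artifacts = distributed_build(["file_a", "file_b", "file_c"])
print(artifacts)  # ['FILE_A', 'FILE_B', 'FILE_C']
```

The key property the sketch shows is that chunks must be independent: because no unit depends on another's output, they can run anywhere idle capacity exists and still reassemble deterministically.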

“It’s a flavor of grid computing,” said Tami Mazel Shachar, the CEO, in an interview, “but the secret sauce is Incredibuild’s approach to parallelization and virtualization. Nothing needs to be installed on the remote computer.”

And most recently, in the last year, following what some of its customers are doing, it has made an even deeper move into the cloud: it has inked partnerships with AWS and Microsoft to integrate the Incredibuild tech directly into gaming stacks run on those companies’ respective cloud platforms. The idea is that using many pieces of small compute in the cloud simultaneously works out to be cheaper and faster than simply running a process on a platform’s biggest single compute instance.

“If I have a heavy process, millions of lines of code, that would take a 64-core machine to process, it’s considered expensive and will run 10 hours,” said Shachar. “But if I take 400 4-core machines and run that for five minutes it is cheaper, shorter and running in less time.”
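Taking Shachar's figures at face value, the comparison checks out in core-hours as well as wall-clock time; a quick sketch of the arithmetic:

```python
# One 64-core machine running for 10 hours vs. 400 four-core machines
# running for 5 minutes, compared in total core-hours billed.
single_machine = 64 * 10            # 640 core-hours over 10 hours
fanned_out = 400 * 4 * (5 / 60)     # ~133.3 core-hours over 5 minutes

print(single_machine, round(fanned_out, 1))  # 640 vs. 133.3
```

Under typical per-core-hour cloud pricing, the fanned-out job consumes roughly a fifth of the compute and finishes in a fraction of the wall-clock time, which is the "cheaper, shorter" claim in the quote.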

She added that it has yet to provide tools to companies to run compute over different cloud providers, and has yet to build a similar deep integration with Google’s cloud platform: the demand from customers for either of those use cases is not there (not yet, at least).

And although cloud is growing in use, the real story still seems to be a lot of motivation to get the most out of on-premise equipment.

“Most of our users are on-prem and then burst to cloud when they have a peak or need,” she said.

The bigger picture for why Incredibuild has been growing well is because its product addresses three key factors in the market today, Shachar said.

The first is that, if you believe that “the metaverse” is more than just a marketing concept, it will require significantly more compute power, and as many organizations are coming to realize, the solution to that will not rely on hardware alone, but also software that can intelligently optimize the usage of existing hardware.

That is related to the second factor, which is that it’s going to be hard to continue relying on hardware because of the chip shortage.

The third factor is that the growing drive for more media-heavy code and more digitized services overall is seeing a massive strain in terms of human capital: there are not enough software developers out there. That is driving a market for more software automation, to take out some of the busy work.

Interestingly, the other big theme in distributed computing has been the big push around decentralization in finance, specifically in areas like cryptocurrency. This is not something that Incredibuild has really touched yet, but I asked if its cheaper and more efficient approach to distribution could ever be applied there, given what a bad rap crypto mining has had for the energy and other resources that it consumes.

“The idea of crypto has been looked at,” Shachar said. “It’s not in our near future, but definitely an option. It’s a question of focus.”

The fact that its focus so far has gotten Incredibuild to a pretty good place as a startup and cash-generating business is an indication that it could well be on the right track.

“Games companies are feeling the squeeze in developer capacity. Incredibuild gives developers back precious time by accelerating build compilation,” said Cherry Freeman, co-founding partner at Hiro Capital, in a statement. “Amazing games companies like Tencent, Take Two, EA, Konami, Nintendo, Capcom, and WB Games are already reaping the benefits of Incredibuild and our hope is that more companies will discover and take advantage of their brilliant technology. As always, Games are the cutting edge for technological advancement, and we envisage a future where Incredibuild will be the de facto distributed supercomputer on every machine in every company.”

As companies grow, they expose more of themselves online and become harder to defend in terms of cybersecurity. One report estimates that 30%-40% of a company’s IT infrastructure is unknown even to the security team.

So startups have appeared with an ‘offensive’ profile in order to simulate cyber attacks.

One such is Amsterdam-based Hadrian, a ‘hacker-led’ cybersecurity startup that offers a SaaS platform which simulates an attack.

It’s closed a €10.5 million seed round led by HV Capital, with participation from Picus Capital, Slimmer AI and angels including Adriaan Mol, Koen Köppen and Niklas Hellman.

Hadrian’s view is that conventional “pen testing” is time- and labor-intensive, and tends to focus on the areas that companies already believe to be vulnerable. Hadrian says its platform scans a company’s IT infrastructure from the outside in to look for weaknesses and create insights on digital threats and attack vectors.
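Outside-in attack-surface discovery of this sort boils down to enumerating candidate assets and checking which ones are actually reachable from the internet. A toy sketch of the idea (the resolver is injected so the example stays offline and self-contained; real tooling would draw on DNS, certificate-transparency logs and port scans, and this is not Hadrian's implementation):

```python
def discover_assets(domain, candidates, resolver):
    """Toy outside-in discovery: report which candidate hostnames resolve.
    `resolver` is any callable mapping hostname -> IP (or None), so real
    DNS lookups could be swapped in; here a fake one keeps this offline."""
    found = {}
    for sub in candidates:
        host = f"{sub}.{domain}"
        ip = resolver(host)
        if ip is not None:
            found[host] = ip
    return found

# Fake resolver standing in for DNS lookups (illustrative data only).
KNOWN = {"www.example.com": "93.184.216.34", "staging.example.com": "10.0.0.7"}
assets = discover_assets("example.com", ["www", "staging", "vpn"], KNOWN.get)
print(sorted(assets))  # the forgotten 'staging' host shows up alongside 'www'
```

The interesting output for a security team is precisely the hosts they did not expect to see, the "30%-40% unknown" infrastructure mentioned above.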

Rogier Fischer, CEO at Hadrian, said in a statement: “Hadrian understands that CISOs and their teams can’t be expected to attend to every potential threat across the attack surface. Our autonomous technology identifies real threats and prioritizes where action is needed, connecting urgent tasks to existing workflow tools and processes so that the important stuff gets handled first.”

Hadrian competes with Randori (raised $29.8M), RiskIQ (acquired by Microsoft), Cortex and Cycognito (raised $153M).

Fischer told me: “Right now the value of ASM is extremely difficult to extract. There’s massive amounts of data that analysts will have to comb through. We’d argue that the value given right now by ASM therefore doesn’t warrant the price (hence insurance companies don’t use the data yet, or at least they’re not outperforming insurance companies with it).”

Backing up the data on your Windows 10 PC should be done regularly, as doing so allows you to restore and recover important files in case of a disaster. Unfortunately, many users fail to back up their data until it’s too late. This article will guide you through the process of backing up and restoring your data in Windows 10.

Setting up File History in Windows 10

File History is a Windows feature that allows you to create scheduled backups of your data on a removable storage device (e.g., external hard drive, flash drive). To set up File History, click Start > Settings > Update & Security > Backup > Back up using File History > Add a drive.

Windows will search and display a list of drives connected to your computer. Choose the one you want to use, then click on More options. This will take you to the Backup options screen where you can set up the parameters for your backup.

Under Back up my files, you can choose how frequently Windows will back up your files. Keep my backups lets you determine how long you want to keep your saved data.

File History is designed to save common folders such as Downloads, Desktop, and Music by default. If you want to add more folders to save, scroll down the Backup options window and click Add a folder. A list of folders will appear and you can choose which ones you want to add.

To remove a folder, just review the list of folders under Backup options, select the folders you want to exclude and click Remove.

After setting up your backup parameters, click Back up now. Windows will create a backup of the folders you included in the Backup options. Once done, the system will give you an overview of the size of the backup and the time and date it was created.

Restoring your files in Windows 10

If one or more of your files disappear or get corrupted, you can use the File History backup you created to restore them. Click Start > Settings > Update & Security > Backup > More options > Restore files from a current backup.

You’ll see a list of all the saved folders. Choose the ones you want to restore, then click the green button at the bottom of the window to restore the folders. If the original files and folders on your computer disappeared, Windows will restore them to their previous locations.

If the original files and folders still exist but are corrupted or inaccessible, Windows will give you the option to keep the original files, replace them, or compare them. Replacing corrupted files with the ones from your File History backup gives you back the most recently backed up version of each file.
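The keep-or-replace choice can be made concrete with a small sketch. The `restore` helper below is again hypothetical, not File History’s actual mechanism: it copies files back from the most recent timestamped backup and, like the Keep option, leaves any file that still exists on the computer untouched unless `overwrite` is set:

```python
import shutil
from pathlib import Path

def restore(backup_root, target_root, overwrite=False):
    """Restore files from the newest timestamped backup under backup_root."""
    # Timestamped names like 2024-05-01_09-30-00 sort chronologically
    backups = sorted(Path(backup_root).iterdir())
    if not backups:
        raise FileNotFoundError("no backups found")
    latest = backups[-1]
    for src in latest.rglob("*"):
        if not src.is_file():
            continue
        dest = Path(target_root) / src.relative_to(latest)
        if dest.exists() and not overwrite:
            continue  # "Keep": leave the existing original in place
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)  # copy2 preserves timestamps
```

Passing `overwrite=True` corresponds to choosing Replace, which brings back the most recently backed up version of every file.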

In case you want to stop using File History, just go to Start > Settings > Backup > Back up using File History, and turn off the Automatically back up my files option.

Backing up your data is an essential task that needs to be done regularly. Call our experts today to learn more about saving and restoring data in Windows.

When Fermyon’s founders were working at Microsoft, they helped build a lot of cloud native technologies. They noted that the development process, particularly around Kubernetes, was complex, and that developers often over-provisioned cloud infrastructure for those times when usage spiked — resources that frequently went unused.

That was a costly hedge for companies and developers, who kept paying for servers that sat unused. As the founders listened to customers at Microsoft, they realized they were hearing a blueprint for a product developers were looking for, and they left the company last year to begin building it.

Today that company announced the launch of the Fermyon platform and a $6 million seed investment.

If you’re thinking the solution sounds a lot like serverless, you’re not wrong, but Matt Butcher, co-founder and CEO at Fermyon, says that instead of forcing a function-based programming paradigm, the startup decided to use WebAssembly, a much more robust programming environment, originally created for the browser.

Using WebAssembly solved a number of problems for the company, including security, speed, and resource efficiency. “All those things that made it good for the browser were actually really good for the cloud. The whole isolation model that keeps WebAssembly from being able to attack the hosts through the browser was the same kind of [security] model we wanted on the cloud side,” Butcher explained.

What’s more, a WebAssembly module downloads quickly and starts executing almost instantly, which addresses the performance question. And instead of keeping a fleet of servers idling in case of peak traffic, Fermyon can start workloads nearly instantly and run them on demand.

So the idea was to take the best of serverless and microservices, and combine them on this new platform that mostly removed Kubernetes from the management side of things and replaced it with a much simpler programming environment.

“What we really wanted was the serverless experience, right? Write a function, write a tiny program and pick your own language, but we wanted the runtime that executed it to be far more flexible and more cost effective, faster, and easier to move around inside of a data center,” he said.

They started by releasing a tool called Spin, which is an open source WebAssembly framework designed for individual developers to interact with the platform. “Spin is an instrumental piece of Fermyon that makes it easy for users to run production workloads with WebAssembly, and it achieved 1,000 GitHub stars within the first 6 weeks after its release,” according to the company.

Today, the company is introducing Fermyon, the next open source piece, which allows teams to work together on the platform. The startup launched at the end of last year and started with 10 engineers on the first day. Butcher hopes to hire 15 people this year as the company develops. The plan is to build the open source community for starters, then once that’s established to start working on commercial pieces.

He said that earlier this year the company held an offsite and defined its values as an organization, one of which was diversity. He recognized that having a core group of founders who all came from a similar background could lead to insularity, and they wanted to make sure everyone who came on board felt welcome and included.

“We want to make sure we’re including women and minorities as we hire, right? We want to make sure that we’re including people in different time zones as we communicate. We want to make sure we’re including the people who are new to the team. And so as we talked through a lot of that at our off site, we basically explored the best practices we can use to amp up every one of these dimensions,” he said.

That includes pairing new employees with someone more experienced, regardless of business function; being aware of time zone differences; and recording every meeting and posting the recordings in Slack, so people can go back and find information they missed. In addition, he starts each meeting by checking in with people and seeing how they are doing before launching into work. All of this is designed to make everyone feel part of the team.

The $6 million seed investment was led by Amplify Partners with participation from a host of industry angels.