Steve Thomas - IT Consultant


  • Microsoft says a new threat actor started targeting critical infrastructure
  • The group is linked to Silk Typhoon
  • It engages in spear phishing and vulnerability exploits

Storm-0227, a Chinese state-sponsored advanced persistent threat (APT) actor, has begun targeting critical infrastructure organizations, as well as government entities, in the United States.

This is according to Sherrod DeGrippo, director of threat intelligence strategy at Microsoft.

Speaking to The Register recently, DeGrippo said that the group abuses software vulnerabilities and engages in spear phishing attacks to gain access to people’s devices.

Commodity malware

Once they gain access, they deploy various Remote Access Trojans (RATs) and other malware to obtain login credentials for services such as Microsoft 365. They also steal sensitive documents and whatever else they can get their hands on. The goal of the campaign is cyber-espionage.

An interesting thing about Storm-0227 is its use of off-the-shelf malware, which, a few years ago, would have come as quite a shock: "Even national-aligned threat actors … are pulling commodity malware out of that trading ecosystem and using it for remote access," she told the publication. Half a decade ago, "that was sort of a shocking thing to see a nation-sponsored, espionage-focused threat actor group really leveraging off-the-shelf malware," she added. "Today we see it very frequently."

There was no word on the number of victims, but DeGrippo described the group as an “embodiment of persistence”.

"China continues to focus on these kinds of targets," she said. "They're pulling out files that are of espionage value, communications that are contextual espionage value to those files, and looking at US interests."

Storm-0227 appears to overlap, at least in part, with Silk Typhoon. There is a whole list of "typhoon" threat actors, all on the payroll of the Chinese government, and all apparently tasked with spying on Western governments, critical infrastructure firms, and other areas of interest (military, aerospace, and similar).

That includes Volt Typhoon, Salt Typhoon, Flax Typhoon, and Brass Typhoon. Salt Typhoon was recently linked to a number of high-profile breaches, including at least four major US telecom operators.

Via The Register


As technology evolves, businesses are increasingly turning to cloud solutions to stay competitive because of their scalability, cost-effectiveness, and ease of management. However, legacy applications are often overlooked when considering cloud adoption due to the perceived challenges involved in migration. In this blog, we will explore various strategies to bring legacy applications seamlessly into your cloud environment so you can modernize your systems.

Assess your current systems

Assessing your current systems enables you to determine which applications are suitable for migration and identify any compatibility issues that may arise during the process. Make an inventory of all your existing applications, along with their versions, dependencies, and usage patterns. Next, bring in developers and software experts to analyze the codebase, identify potential roadblocks, and determine if the applications can be rebuilt to function in a cloud environment.
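As a rough sketch of what that inventory might look like in practice, the snippet below models one inventory entry and flags applications whose dependencies have no cloud equivalent. All names here (the record fields, the sample apps, the unsupported components) are illustrative assumptions, not part of any real tool:

```python
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    """One entry in the application inventory (field names are illustrative)."""
    name: str
    version: str
    dependencies: list[str] = field(default_factory=list)
    monthly_active_users: int = 0

def flag_compatibility_risks(apps: list[AppRecord], unsupported: set[str]) -> list[str]:
    """Return the names of apps that depend on a component with no cloud equivalent."""
    return [a.name for a in apps if any(d in unsupported for d in a.dependencies)]

inventory = [
    AppRecord("payroll", "3.2", ["oracle-forms", "smtp"], 120),
    AppRecord("intranet", "1.0", ["iis6"], 45),
]
print(flag_compatibility_risks(inventory, {"oracle-forms", "iis6"}))
# Both sample apps depend on a legacy component, so both are flagged
```

Even a simple structured inventory like this makes the later strategy decision much easier, because compatibility roadblocks surface before any migration work starts.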

Choose a migration strategy

There are several approaches you can take when migrating legacy applications to the cloud, including:

  • Rehosting: Move the application as is to the cloud without making any changes. It is a quick and cost-effective option but may not fully utilize the capabilities of the cloud.
  • Application evolution: Refactoring or making incremental changes to the application makes it more cloud-friendly. This approach is suitable for applications with a long lifespan, and it can be performed gradually.
  • Application modernization: Rebuild the application from scratch using cloud-native architecture. This approach provides maximum benefits but requires a significant investment of time and resources.

The option you choose will depend on factors such as the complexity and criticality of the application, budget, timeline, and resources available.
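The decision logic can be sketched as a very rough rule of thumb. The thresholds and inputs below are illustrative assumptions only; a real decision would weigh many more factors than three:

```python
def suggest_strategy(lifespan_years: float, budget_level: str, is_critical: bool) -> str:
    """Map a few of the factors above to one of the three approaches.
    Purely a heuristic sketch, not a definitive decision procedure."""
    if budget_level == "low" or lifespan_years < 2:
        return "rehost"      # quick lift-and-shift, minimal investment
    if budget_level == "high" and is_critical:
        return "modernize"   # a cloud-native rebuild justifies the cost
    return "evolve"          # incremental refactoring over time

print(suggest_strategy(1.5, "low", False))  # short-lived, low budget
print(suggest_strategy(10, "high", True))   # long-lived core system
```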

Create a project timeline

Timing is crucial when migrating legacy applications to the cloud. You don’t want to disrupt business operations or expose your systems to potential risks during the migration process. Consider the dependencies between applications, peak usage periods, the complexity of the development and testing phase, and data migration that may impact the project timeline. The duration of the migration varies depending on the size and complexity of the application. If the legacy application is core to your business, the process may take several months, while smaller applications can be migrated within a shorter timeframe.

Develop a data migration plan

Data is often the most critical aspect of an application, and any loss or corruption during the migration phase can set you further back than anticipated. Identify all the data sources and their formats, and determine how they will be migrated to the cloud. Companies can perform a bulk data migration or opt for a phased approach, depending on the size and complexity of their data. Back up your data before the migration process to avoid any data loss or corruption. It’s also important to test the migrated data in the cloud environment to ensure its integrity before going live.
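One common way to test migrated data for integrity is to compare checksums of the source files against their migrated copies. The sketch below does this for a local directory tree; the file names and directories are hypothetical, and in a real migration the "target" side would typically be object storage rather than a local path:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large exports don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_migration(source_dir: Path, target_dir: Path) -> list[str]:
    """Return relative paths of source files that are missing from the
    target or whose contents differ after migration."""
    mismatches = []
    for src in source_dir.rglob("*"):
        if src.is_file():
            rel = src.relative_to(source_dir)
            dst = target_dir / rel
            if not dst.is_file() or sha256_of(src) != sha256_of(dst):
                mismatches.append(str(rel))
    return mismatches

# Quick self-check with temporary directories:
import tempfile
src, dst = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
(src / "a.txt").write_bytes(b"hello")
(dst / "a.txt").write_bytes(b"hello")
(src / "b.txt").write_bytes(b"data")  # never copied to the target
print(verify_migration(src, dst))
# b.txt is reported as a mismatch because it is missing from the target
```

An empty mismatch list is a reasonable go-live gate for the data-integrity check described above.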

Execute and monitor the migration

When all your plans are in place, it’s time to begin the migration process. Start with code refactoring, if necessary, and then move on to application rehosting or modernization. As the applications are migrated, move your data to the cloud and thoroughly test the functionality of the applications. Monitor the migration process closely, keeping an eye out for any errors or bugs that may arise. It may be wise to perform the migration in phases rather than all at once to minimize potential risks.
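The phased approach can be sketched as a loop that stops at the first failure, so later phases are never attempted on top of a broken state. The phase names, app names, and `migrate_one` callback are all hypothetical stand-ins for whatever actually moves an application:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("migration")

def migrate_in_phases(phases, migrate_one):
    """Run migration phase by phase; return (apps migrated, success flag).
    Stops at the first failing app so the error can be investigated."""
    done = []
    for phase, apps in phases:
        for app in apps:
            try:
                migrate_one(app)
                done.append(app)
                log.info("migrated %s (phase %s)", app, phase)
            except Exception as exc:
                log.error("phase %s failed on %s: %s", phase, app, exc)
                return done, False
    return done, True

# Illustrative run: the second phase contains an app that fails.
def fake_migrate(app):
    if app == "billing":
        raise RuntimeError("schema mismatch")

plan = [("pilot", ["wiki"]), ("core", ["billing", "crm"])]
print(migrate_in_phases(plan, fake_migrate))
# The pilot phase succeeds, then the run halts on "billing"
```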

Conduct post-migration testing and optimization

The work doesn’t end once your application has been migrated; post-migration testing is crucial to confirm the success of the transition. Assess the data integrity, performance, functionality, and security of your new cloud environment and adjust configurations where necessary. You should also provide detailed walkthroughs and practical training on how to use the new system to increase user adoption and optimize the user experience.
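Post-migration checks are easiest to keep on top of when they are scripted and run as a batch. Below is a minimal sketch of such a smoke-test harness; the check names are invented examples, and in practice each callable would hit a real endpoint or query rather than return a constant:

```python
def run_smoke_checks(checks):
    """Run named post-migration checks and collect pass/fail results.
    Each check is a zero-argument callable returning True on success;
    an exception inside a check is recorded as a failure, not a crash."""
    results = {}
    for name, check in checks.items():
        try:
            results[name] = bool(check())
        except Exception:
            results[name] = False
    return results

def failing_check():
    raise RuntimeError("report timeout")  # simulates a broken feature

checks = {
    "data_row_count_matches": lambda: True,
    "login_page_reachable": lambda: True,
    "report_generation": failing_check,
}
print(run_smoke_checks(checks))
# report_generation is recorded as a failure instead of aborting the run
```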

There’s a lot that goes into migrating legacy applications to the cloud, but cloud professionals can make the process so much easier. Contact us to seamlessly move your legacy applications to the cloud. We’ll handle all the technical details so you don’t have to.


  • Security researchers found two new malware variants, an infostealer and a loader
  • The developers seem to be the same group that's behind more_eggs
  • The infostealer can grab passwords, cookies, and more

Venom Spider, the threat actor behind the infamous More_eggs malware, is expanding its malware-as-a-service (MaaS) operation. This is according to a new report from cybersecurity researchers at Zscaler ThreatLabz, who recently found two new malware families linked to the same developer.

In a detailed report published earlier this week, the researchers said that Venom Spider (also known as Golden Chickens) built an infostealer called RevC2, and a loader named Venom Loader.

The infostealer can grab people’s login credentials and cookies from Chromium-based browsers (Chrome, Edge, Brave, and others). It can also run shell commands, take screenshots, proxy traffic using SOCKS5, and execute commands as a different user. The loader, on the other hand, is customized for each victim and uses the victim’s computer name to encode the payload, the researchers said.

VenomLNK

The researchers first observed the new malware being used in August this year, and have been tracking it ever since. They don’t know exactly how the malware is distributed to the victims, but suspect it all starts with VenomLNK. This is an initial access tool that the researchers observed being used to deploy both of the above-mentioned malware, while at the same time, showing a decoy PNG image to the victim.

This is not the first time VenomLNK has been seen in the wild, as the experts said it was previously used to deploy More_eggs lite.

More_eggs is a JavaScript-based loader used to infiltrate systems by downloading and executing additional malicious payloads, typically after gaining an initial foothold through phishing emails or malicious links.

The malware is notorious for its stealthy behavior, as it leverages legitimate processes and tools to evade detection. Attackers often deploy More_eggs to install ransomware, steal sensitive data, or provide remote access to compromised systems.

More_eggs has been around for at least three years, possibly longer.

Via The Hacker News

  • Google Photos is rolling out a new Recap feature today
  • Recap videos give you a Spotify Wrapped-style summary of your year
  • The feature includes highlights and stats like your longest photo streak

If you're a big Google Photos fan, the service knows a lot about your life – and from today it's crunching all of that data together to make a new Spotify Wrapped-style highlights video of your year called Recap.

Rolling out from today in the Google Photos app, Recap goes a bit further than the Memories feature it's based on. There are the usual photo and video highlights, but like Wrapped you get stats based on your photos – like your longest photo streak and the top colors you photographed in 2024.

Recap can also reveal who you smiled the most with this year, but to power all of this you need to have Google Photos' 'Face Groups' setting turned on. To check that, inside the app go to your Account profile photo or initials in the top-right, then Settings > Privacy > Face Groups.

The Recap feature takes the form of a short video that sits in your Memories carousel and you'll get a notification in the Google Photos app when yours is ready. Google says it'll sit in the carousel throughout December so it's handy for sharing over the holidays, but you can also share it to social media from the app.

Google also says that "select users" in the US can also opt in to receive a version of their Recap video with personalized captions that are generated by Google Gemini. To do that, you'll need to head into the app's Settings and opt into using Gemini features.

How much do you want Google Photos to know?

[Image: Two phones showing the Google Photos Recap highlights reel (Image credit: Google)]

New Google Photos features like Recap and 'Ask Photos' could divide opinion among its users. For some, they'll be fun, time-saving tools that save them from what were once huge photo book projects. But others could find its all-knowing analysis of their photo libraries a little creepy and invasive.

Recap goes a step beyond being a highlights reel with stats like the number of smiles you captured, who you smiled with the most and your favorite colors of 2024 (above).

That's all pretty innocent, but there's also a danger that those who haven't delved into Google Photos Memories settings like 'hide a face' could get a Recap video that treads on sensitive ground. A Google spokesperson told us: "Not all memories are worth revisiting, so we use filters and do our best to avoid resurfacing upsetting memories. However, there may be times where we don’t get it right."

"That’s why Google Photos includes controls to hide photos of certain people or time periods within Memories. Hidden people and dates will not appear in your Recap," the spokesperson reassured us.

Gemini-powered features like 'Ask Photos' (and the personalized Recap captions available to "select users" in the US) are also opt-in, so Google recognizes there is a sensitivity to its most powerful AI features being applied to Google Photos.

The tech giant's privacy hub for Google Photos says that your personal data in Photos is "never used for ads" and that your 'Ask Photos' responses "aren't reviewed by humans", but with cloud photo libraries becoming increasingly smart it's worth deciding how much you want the services to know about your life.


  • A yearly certification should be mandatory for US telcos, FCC Chair said
  • The initiative should help businesses tackle rising attacks
  • China denies any involvement

American telecommunications organizations should be required to submit a yearly certification confirming they have a solid cyber-incident response plan in place.

This is a proposal set forth by US Federal Communications Commission Chairwoman Jessica Rosenworcel, in response to recent news that Chinese state-sponsored threat groups have entrenched themselves deeply into US telecom providers, possibly snooping in on important communications for years.

Earlier this year, multiple cybersecurity organizations, and later government agencies too, reported that a Chinese threat actor named Salt Typhoon had infiltrated some US telecommunications giants and was pulling valuable data.

Immediate effect

Later, a number of organizations confirmed the findings, including T-Mobile, Verizon, Lumen Technologies, and AT&T. The campaign seems to be global, affecting “dozens” of private and public sector firms around the world.

"While the Commission's counterparts in the intelligence community are determining the scope and impact of the Salt Typhoon attack, we need to put in place a modern framework to help companies secure their networks and better prevent and respond to cyberattacks in the future," Rosenworcel said in a statement.

Reuters cited Rosenworcel saying the proposal was being circulated to other commissioners in her agency. If adopted, it would take effect immediately, it was added.

The victims are now working diligently to oust the spies in an ongoing effort, with no concrete deadline set.

At the same time, the Chinese government remains silent. In the past, it has denied these allegations on numerous occasions, even accusing the US of being the world’s cyber-bully at one point. A few months ago, it released a report in which it claimed that Volt Typhoon, another hacking collective, was actually a CIA asset.

The document asserts that China consulted over 50 cybersecurity experts, who collectively determined both the US and Microsoft do not have enough evidence to implicate China’s involvement with Volt Typhoon. However, the names of the experts are not included in the document.

Via Reuters
