    Network jobs watch: Hiring, skills and certification trends

    Network and infrastructure roles continue to shift as enterprises adopt technologies such as AI-driven network operations, multicloud networking, zero trust network access (ZTNA), and SD-WAN. Here’s a recap of some of the latest industry research, hiring statistics, and certification trends that impact today’s network professionals, infrastructure and operations (I&O) leaders, and data center teams. Check back for regular updates.

    AI certifications garner higher pay

    Pay boosts for IT skills have been in steady decline for three years—but AI certifications are contradicting the trend, according to new data from Foote Partners. In Q3 2025, both certified and non-certified IT skills saw a 0.7% dip in cash pay premiums. But AI-related certifications managed to rise nearly 12% year over year, while pay for non-certified AI skills dipped 1%. The research tracks 124 AI-related skills and certifications, with pay premiums ranging between 7% and 23% of base salary. Interestingly, non-certified AI skills still tend to pull in higher bonuses—averaging 14.5% of base pay versus 8.3% for certified AI roles. Beyond AI, top-paying domains include DevSecOps, SRE, DataOps, blockchain, and security architecture.

    October 2025

    AI and automation reshape IT job market

    Artificial intelligence and automation are driving substantial job reductions across the IT job market, according to new research from Janco Associates. The management consulting firm reports that the telecommunications industry has lost 59% of its positions to automation and AI, marking a trend that is now spreading to other IT disciplines. Janco Associates reports that some industry leaders are predicting that about 18% to 22% of the workforce will be eliminated due to automation and AI in the next 5 years. And many CIOs and CTOs expect to cut existing positions for IT professionals in the next three years. More than half of CIOs and CFOs surveyed by Janco Associates indicated that they would prefer AI-enabled developers with two to three years of experience over traditional IT professionals with five years of experience and no AI skills. Among the roles expected to be reduced due to AI and automation are support roles, help desks, non-automated testing, code generation, and legacy infrastructure administration. Still, Janco Associates reports that certain areas continue to see growth. For instance, positions in AI development and deployment, big data analytics, cybersecurity, and compliance continue to grow.

    September 2025

    AI job listings spike 94%

    According to new data from CompTIA, active employer job listings requiring artificial intelligence (AI) skills surged by 94% in August compared to the same month last year. Tech occupation employment data includes employers across all industry sectors, and the latest research from the U.S. Bureau of Labor Statistics (BLS), analyzed by CompTIA, revealed that tech employment increased by an estimated net new 247,000 workers in August. In August, the tech worker unemployment rate edged up to 3%, while employer job postings for tech positions declined 2.6% from July’s numbers. “Hiring intent data continues to show employers pursuing tech talent across a range of disciplines, from AI and data science to tech support and cloud engineering,” said Tim Herbert, chief research officer at CompTIA, in a statement. The full CompTIA Tech Jobs Report is available here.

    September 2025

    Cisco introduces wireless cert track

    Cisco is introducing a new wireless-only certification track featuring two credentials: CCNP Wireless and CCIE Wireless. CCNP Wireless is designed to validate advanced skills in designing, implementing, and managing modern Wi-Fi infrastructures, covering topics such as RF fundamentals, 802.11 standards (including Wi-Fi 6 and 7), client connectivity, monitoring, automation, AI, FlexConnect, QoS, multicast, security, and location services. Candidates must pass a core exam plus one concentration exam. CCIE Wireless targets expert‐level mastery in enterprise wireless, encompassing design, deployment, optimization, and operations across Cisco and platform-agnostic technologies (including Meraki and Wi-Fi 6/7). Candidates are expected to have roughly five to seven years of experience. The certification requires both a written core exam and a hands-on lab exam. The first exam availability is set for March 19, 2026, and Cisco is providing preparatory resources via its learning network. Read the full story here.

    August 2025

    CompTIA launches AI prompting cert

    CompTIA has released AI Prompting Essentials, a certification aimed at helping professionals gain practical skills for using artificial intelligence tools in the workplace. The program teaches professionals how to identify tasks suited for AI, craft effective prompts, refine responses, and automate routine processes to improve productivity and collaboration. The course takes six to eight hours to complete and includes guidance on using AI ethically and securely in common workplace projects. Priced at $99, it is designed for beginners and requires no prior experience. AI Prompting Essentials complements CompTIA’s AI Essentials, a six-hour course that provides a foundational understanding of AI concepts. While AI Essentials introduces the basics, AI Prompting Essentials offers more hands-on skills for working directly with AI tools. Read the full story here.

    August 2025

    Report: AI fuels job growth

    Despite high-profile tech layoffs and concerns about AI replacing jobs, the Linux Foundation’s 2025 State of Tech Talent report reveals that AI is fueling job growth. Based on responses from more than 500 global hiring leaders, the study found that 2.7× more organizations expanded their workforce than reduced headcount due to AI. For AI-specific roles, 57% of organizations are increasing staff in AI/ML engineering operations, and 3% are cutting positions in those areas. Entry-level technology jobs experienced net growth, with 24% of organizations reporting increased hiring, compared to 18% reporting reductions, while the remainder saw no impact. Yet, significant skills shortages persist. 68% of respondents say they’re understaffed in AI/ML operations and engineering. Other deficit areas include cybersecurity and compliance (65%), FinOps/cost optimization (61%), cloud computing (59%), and platform engineering (56%). Read the full story here.

    July 2025

    CompTIA expands Linux+ cert

    CompTIA has updated its Linux+ certification exam to include new and expanded content on artificial intelligence, automation, cybersecurity, DevOps, infrastructure as code (IaC), scalability, and systems troubleshooting. CompTIA Linux+ V8 is a vendor-neutral certification that validates the skills to automate administration tasks and streamline operations with shell scripting, Python, and configuration management tools. It also tests IT pros’ knowledge to deploy, maintain, and monitor containers and virtual machines. Read the full story here.

    July 2025

    IT salary growth slows, tech execs at large companies earn most

    Base salaries for all IT pros grew by 2.35% to reach $100,946, according to Janco Associates’ 2025 Mid-Year IT Salary Survey.

    Including bonuses and other benefits, IT pros saw their total compensation increase by 2.26% in large enterprises and 1.95% in midsized companies, the survey found. Salaries in midsized organizations are rising at a slower rate than in their large enterprise counterparts. Total compensation for IT executives in large enterprises is 13% higher than in midsized enterprises. IT professionals are paid $5,119 less on average in SMBs versus their counterparts in large enterprises.

    “IT executives in large firms have annual compensation of just over $188,000. At the same time, individuals with comparable roles in midsized enterprises are paid around $163,000. That is almost $25K less,” the report reads.

    July 2025

    Tech unemployment rate drops

    Tech unemployment in June was 2.8%, compared to 3.4% in May, according to data and analysis from CompTIA’s recently released Tech Jobs Report. Employers posted 455,341 job listings for tech positions in June, and tech occupation employment increased by an estimated net new 90,000 workers for the month, according to CompTIA’s analysis of U.S. Bureau of Labor Statistics (BLS) data. Of the positions listed in June, 211,924, or 47%, were newly added that month, with the most in-demand roles including software developers, systems engineers, tech support, cybersecurity professionals, and network architects.

    “Tech employment showed surprising strength for the month given recent expectations,” said Tim Herbert, chief research officer at CompTIA, in a statement. Read the full story here.

    July 2025

    U.S. tech workforce sees slight growth

    The U.S. tech workforce continued to expand in 2024, adding 72,500 net new workers and signaling 1.2% job growth across core technology sectors, according to a new report from CompTIA.

    CompTIA’s State of the Tech Workforce 2025 report found that occupations such as data, cybersecurity, infrastructure, and tech enablement are growing. AI-related positions also increased, with software and web development roles seeing slower growth. CompTIA analyzes data provided by the U.S. Bureau of Labor Statistics. Looking ahead, the CompTIA report projects the U.S. tech occupation workforce will reach 6.1 million in 2025 based on available information, while the workforce reached slightly over 5.9 million in 2024. Read the full report here.

    July 2025

    IT unemployment rate greater than overall U.S. rate

    The IT unemployment rate continues to run above the national rate, according to a new report from Janco Associates. The unemployment rate for IT professionals grew from 4.6% to 5.5%, marking the fifth consecutive month that it has been higher than the U.S. national unemployment rate.

    “There continues to be uncertainty in the outlook for new IT job creation. For five consecutive months, the IT unemployment rate has been greater than the national unemployment rate. In the same five months, there was an increase in the number of unfilled jobs,” said M. Victor Janulaitis, CEO of Janco Associates.

    The information sector—which encompasses tech jobs like telecommunications and software—added just 2,000 jobs in May, a modest gain compared to industries such as healthcare and leisure and hospitality, according to the monthly U.S. Bureau of Labor Statistics (BLS) Current Employment Statistics (CES).

    According to Janco, most of the unfilled positions are for roles associated with artificial intelligence (AI), large language models (LLM), blockchain, and omnicommerce. The analyst firm noted that hiring and job growth continue to be in small and midsized enterprises, while larger firms are focusing on applying AI to improve efficiency and productivity.

    “Many of the larger firms continue to be focused on improvements in productivity and replacing lower-level skills with AI applications. AI continues to halt the growth of entry-level positions within IT, especially in customer service, internal reporting, telecommunications, and hosting automation,” Janco’s report states.

    June 2025

    CompTIA plans OT security certification

    IT training and certification provider CompTIA has begun to develop a new certification focused on cybersecurity skills in operational technology (OT). The SecOT+ certification will provide OT professionals with the skills to manage, mitigate, and remediate security risks in manufacturing and critical infrastructure environments. The certification program will give OT professionals, such as floor technicians and industrial engineers, as well as cybersecurity engineers and network architects on the IT side, a common skill set.

    While developing the certification program, CompTIA will focus the SecOT+ certification on:

    • Risk assessment-driven approaches to cybersecurity
    • Compliance and regulatory frameworks for OT
    • Hardening techniques and secure configurations
    • Managing third-party risks and supply chain security
    • Integrating and securing legacy systems

    CompTIA expects SecOT+ to be available in the second half of 2026. Read the full article here.

    May 2025

    10 network observability certs to check out

    Network observability platforms collect and analyze all kinds of telemetry data—such as logs, metrics, events, and traces—to provide insights into the health and performance of applications, services, and infrastructure. A growing number of vendors are offering network observability certifications to help IT professionals better master observability platforms and validate their skills. The certifications focus on areas such as telemetry, traffic analysis, performance monitoring, and anomaly detection. Cisco (including its Splunk and ThousandEyes divisions), Cribl, Datadog, Dynatrace, Elastic, LogicMonitor, New Relic, and SolarWinds are among the vendors with network observability certs. Read the full article here.

    April 2025

    CompTIA updates A+ cert

    CompTIA has updated its vendor-agnostic A+ certification designed for entry-level tech support personnel. The A+ certification exam now includes:

    • Artificial intelligence concepts, such as appropriate use, limitations to consider, and understanding the differences between private and public data.
    • Cybersecurity measures, protocols, and tools for malware detection and prevention.
    • Cloud computing models and virtualization concepts.
    • Software troubleshooting skills to diagnose and resolve software issues.
    • Understanding networking solutions, including VPN and secure remote access, and the ability to connect, configure, and troubleshoot a variety of network devices.

    CompTIA recommends professionals have between nine and 12 months of experience before taking the certification exam, which has two parts. CompTIA A+ 220-1201 covers mobile devices, networking technology, hardware, virtualization, and cloud computing. CompTIA A+ 220-1202 covers operating systems, security, software, and operational procedures.

    “As organizations increase their reliance on technology in all aspects of their operations, tech support teams have to be savvy and knowledgeable in many areas,” Katie Hoenicke, senior vice president, product development at CompTIA, said in a statement. “IT pros who achieve CompTIA A+ certification have demonstrated they are capable of working in this fast-changing environment.”

    April 2025

    Companies struggle to retain tech talent as IT pros switch jobs

    A recent ISACA study found that nearly three-fourths (74%) of companies surveyed are concerned about retaining technology talent. The same study also found that one in three IT professionals switched jobs in the past two years.

    The global ISACA Tech Workplace and Culture study surveyed 7,726 technology professionals in the fourth quarter of 2024 to learn more about career satisfaction, compensation, and more. The study found that a majority (79%) of IT pros experience stress on the job, and respondents identified the main work-related stressors as:

    • Heavy workloads: 54%
    • Long hours: 43%
    • Tight deadlines: 41%
    • Lack of resources: 41%
    • Unsupportive management: 41%

    Survey respondents also cited the top reasons for leaving a job as the following:

    • Desire for higher compensation
    • Desire to improve career prospects
    • Desire for more interesting work

    “A robust and engaged tech workforce is essential to keeping enterprises operating at the highest level,” said Julia Kanouse, Chief Membership Officer at ISACA, in a statement. “In better understanding IT professionals’ motivations and pain points, including how these may differ across demographics, organizations can strengthen the resources and support these employees need to be effective and thrive, making strides in improving retention along the way.”

    March 2025

    Network pros: Upskill in AI and automation

    Networking skills must advance alongside emerging technologies, according to industry watchers. Networking professionals should get training around artificial intelligence and automation to design, build, and manage the networks businesses need to succeed today. Networking pros must have the skills to integrate new AI applications with the underlying AI infrastructure and to enable AI to assist with networking tasks.

    “Networking roles are undergoing significant evolution, with a key emphasis on the integration of emerging technologies such as AI and automation,” says Joost Heins, head of intelligence at Randstad Enterprise, a global talent solutions provider.

    By developing skills in network monitoring, performance management, and cost optimization through automation and AI-powered tools, networking pros can become more adept at troubleshooting while offloading repetitive tasks such as copy-pasting configurations. Over time, they can gain the skills to better understand which behaviors and patterns to automate.
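
    As a concrete illustration of the kind of repetitive task these skills can offload, here is a minimal sketch (not from the article) that pushes the same configuration snippet to a list of switches instead of pasting it by hand. It assumes the open-source netmiko library is installed, and the device addresses and credentials shown are placeholders.

```python
# Minimal sketch (illustrative only): push one configuration snippet to many
# devices instead of copy-pasting it by hand. Assumes the open-source netmiko
# library is installed; the inventory below uses placeholder hosts/credentials.
from netmiko import ConnectHandler

CONFIG_SNIPPET = [
    "ntp server 192.0.2.10",
    "logging host 192.0.2.20",
]

INVENTORY = [
    {"device_type": "cisco_ios", "host": "10.0.0.1",
     "username": "admin", "password": "example-password"},
    {"device_type": "cisco_ios", "host": "10.0.0.2",
     "username": "admin", "password": "example-password"},
]

def push_snippet(device: dict) -> str:
    """Open an SSH session, apply the snippet, and return the device output."""
    with ConnectHandler(**device) as conn:
        return conn.send_config_set(CONFIG_SNIPPET)

if __name__ == "__main__":
    for device in INVENTORY:
        print(f"--- {device['host']} ---")
        print(push_snippet(device))
```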

    Read the full story here.

    February 2025

    CompTIA launches CloudNetX certification

    The vendor-neutral CompTIA CloudNetX certification is now available, targeted at senior-level tech pros who want to validate that they’ve got the skills to design, engineer, and integrate networking solutions from multiple vendors in hybrid cloud environments. Professionals should have a minimum of ten years of experience in the IT field and five years of experience in a network architect role, with specific experience in a hybrid cloud environment, CompTIA recommends.

    “The demand for highly skilled network architects has surged as organizations increasingly adopt hybrid cloud solutions,” said Katie Hoenicke, senior vice president, product development at CompTIA, in a statement. “For seasoned network professionals, CompTIA CloudNetX can help them enhance and validate the advanced skills needed to excel in these complex environments.”

    CompTIA says the exam covers:

    • Technologies such as container networking, software-defined cloud interconnect, and generative AI for automation and scripting.
    • Network security, including threats, vulnerabilities, and mitigations; identity and access management; wireless security and appliance hardening; and zero-trust architecture.
    • Business requirements analysis to design and implement network solutions, ensuring candidates can align technical skills with organizational goals.

     Read more about CompTIA’s Xpert Series certifications here.

    February 2025

    Tech skills gap worries HR, IT leaders

    An overwhelming majority (84%) of HR and IT leaders surveyed by technology talent-as-a-service provider Revature reported that they are concerned with finding tech talent in the coming year. The survey polled some 230 HR and IT decision-makers, and more than three-quarters (77%) said that their team has been affected by the current IT skills gap. While 56% of respondents said upskilling/reskilling is their strategy for closing the IT skills gap, many reported ongoing challenges. Among the challenges survey respondents have experienced are:

    • Finding qualified talent with the necessary skills: 71%
    • IT staffing companies can’t deliver talent quickly: 57%
    • Upskilling/reskilling in-house talent: 53%
    • Learning Management Systems are ineffective: 30%
    • Overall cost of training and staffing: 23%

    When asked which technical skills are important, 29% of respondents pointed to artificial intelligence, generative AI and machine learning skills. And 75% of respondents believe they are highly prepared or prepared for the influx of new technologies such as genAI, with 63% believing genAI will positively impact training and 56% saying it will help with hiring and retention in 2025.

    February 2025

    CompTIA releases AI Essentials program

    CompTIA recently launched its AI Essentials program that promises to help professionals develop skills in AI fundamentals.

    The CompTIA AI Essentials program provides self-paced lessons with videos, activities, reflection questions, and assessments. The training will help professionals distinguish AI from other types of intelligence and computing and teach them how to communicate about AI effectively. Students will also learn how to create AI prompts and navigate the privacy and security concerns that AI technology presents.

    The program uses realistic scenarios and practice activities to show how AI is applied in real-world situations. According to CompTIA, topics covered in the training include: AI Unveiled; Generative AI Frontiers; Engineering Effective Prompts; Balancing Innovation and Privacy; and Future Trends and Innovations in AI.

    Available now, CompTIA AI Essentials costs $129 and includes a license that would be valid for 12 months. Read the full story here.

    January 2025

    Mixed bag for IT, tech jobs

    Industry watchers continue to keep close tabs on the IT workforce as some research shows 70,900 tech jobs were cut from the economy, while other organizations report that the unemployment rate for technology workers has dropped to 2%, the lowest level in more than a year.

    Janco Associates reports that 48,600 jobs were lost in 2023 along with 22,300 positions eliminated in 2024, based on U.S. Bureau of Labor Statistics data. “In 2023 and 2024, there was a major re-alignment in the way things are done within the IT function. With all the new advances in technology, many jobs have been eliminated or automated out of existence,” said M. Victor Janulaitis, CEO of Janco.

    Separately, CompTIA recently reported that the tech unemployment rate dropped to 2%, while the national unemployment rate remained unchanged at 4.1% for December. CompTIA reported that the base of tech employment throughout the economy increased by a net new 7,000 positions, putting the total number of tech workers at about 6.5 million.

    January 2025

    CompTIA updates penetration testing cert

    CompTIA recently announced it had upgraded its PenTest+ certification program to educate professionals on cybersecurity penetration testing with training on artificial intelligence (AI), scanning and analysis, and vulnerability management, among other things.

    PenTest+ will help cybersecurity professionals demonstrate competency in current penetration testing practices, prove they are up to date on the latest trends, and show they can perform hands-on tasks. According to CompTIA, professionals completing the PenTest+ certification course will learn the following skills: engagement management, attacks and exploits, reconnaissance and enumeration, vulnerability discovery and analysis, and post-exploitation and lateral movement.

    The PenTest+ exam features a maximum of 90 performance-based and multiple-choice questions and runs 165 minutes. Testers must receive a score of 750 or higher to pass the certification test. CompTIA recommends professionals taking the certification course and exam also have Network+ and/or Security+ certifications or equivalent knowledge, and three to four years of experience in a penetration testing job role. Pricing for the exam has yet to be determined. Read the full story here.

    January 2025

    CompTIA launches SecurityX cert

    CompTIA this week made available its SecurityX certification, which it had announced as part of its Xpert Series of certifications. SecurityX is designed for IT professionals with multiple years of work experience as security architects and senior security engineers who want to validate their expert-level knowledge of business-critical technologies. The program will cover the technical knowledge and skills required to architect, engineer, integrate, and implement enterprise security solutions across complex environments. CompTIA expects to release another expert-level certification program, CompTIA CloudNetX, in the coming months. Read the full story here.

    December 2024

    CompTIA unveils starter courses for network, security certs

    The new CompTIA a+ Network and CompTIA a+ Cyber courses aim to provide newcomers with the knowledge they need to start a tech career in networking and security. The skills gained will help people to train for higher-level certifications, according to CompTIA. CompTIA a+ Network includes 31 hours of instruction and teaches individuals to set up and support networks, troubleshoot issues, and manage Linux and Windows systems. CompTIA a+ Cyber covers the skills to secure devices and home networks. The price for each course is $499. Read the full story here.

    A third new certification from CompTIA aims to teach newcomers tech foundations. CompTIA Tech+ is designed to provide a “spectrum of tech knowledge and hands-on skills” to students looking to ultimately work in tech-based roles, according to the provider. The Tech+ certification covers basic concepts from security and software development as well as information on emerging technologies such as artificial intelligence, robotics, and quantum computing. Specific details on the exam for the CompTIA Tech+ certification are not yet available. Read the full story here.

    December 2024

    AI helps drive IT job growth

    Artificial intelligence is driving growth for the IT workforce at some enterprises. Nearly half (48%) of organizations polled for the Motion Recruitment 2025 Tech Salary Guide said they plan to add workers due to an increase in AI investments, compared to 19% that said they would downsize in relation to the technology. AI is also credited with transforming existing roles, with 23% of organizations shifting existing staff positions into roles that directly address AI, according to Motion Recruitment.

    Also of note: The number of fully remote tech positions is decreasing as the average time spent in office has grown from 1.1 days per week to 3.4 days per week. Read the full story here.

    December 2024

    New OpenTelemetry certification

    A new certification program aims to validate the skills needed to use OpenTelemetry, which helps IT teams gain visibility across distributed systems of cloud-native and microservices-based applications. Created by the Cloud Native Computing Foundation (CNCF) and Linux Foundation, the OpenTelemetry Certified Associate (OTCA) certification is designed for application engineers, DevOps engineers, system reliability engineers, platform engineers, or IT professionals looking to increase their abilities to leverage telemetry data across distributed systems.

    Telemetry data is critical to observability technologies because it provides raw, detailed information about system behavior, offering insights beyond basic monitoring metrics. Telemetry data can also be used to enable proactive problem detection and resolution in distributed systems.
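
    For readers who want a feel for what working with OpenTelemetry looks like, below is a minimal, hypothetical sketch (not drawn from the OTCA curriculum) that emits a single trace span and prints it to the console using the OpenTelemetry Python SDK; it assumes the opentelemetry-sdk package is installed.

```python
# Minimal sketch (illustrative only, not OTCA exam material): create one trace
# span with the OpenTelemetry Python SDK and export it to the console.
# Assumes the opentelemetry-sdk package is installed.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Wire a tracer provider to a console exporter so spans are printed locally.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("demo.checkout")

# Record a span around a unit of work, with an attribute and an event.
with tracer.start_as_current_span("process-order") as span:
    span.set_attribute("order.items", 3)
    span.add_event("inventory-checked")
```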

    OTCA includes a 12-month window to schedule and take the exam, as well as two exam attempts; the exam is priced at $250. Read the full story here.

    November 2024

    AI, cybersecurity top skill shortages for 2025

    IT leaders are planning investments for 2025, and they expect to be putting budget dollars toward technologies such as artificial intelligence (AI), machine learning (ML), cybersecurity, cloud, and more. Yet while investing in innovative technologies is part of future planning, IT decision-makers also expect to struggle to staff certain roles due to an ongoing tech skills shortage.

    According to a Skillsoft survey of 5,100 global IT decision-makers, the most difficult technology areas to hire for included cybersecurity/information security (38%), cloud computing (22%), and AI/ML (20%), among several others. As new technologies emerge, IT leaders must take inventory of the skills they have in-house, and this survey found that 19% of respondents believe there is a “high risk of organizational objectives not being met due to skills gaps.” Read the full story here.

    November 2024

    Tech employment remains flat in October

    Tech employment experienced little to no change in October, indicating that by year-end there will not be enough roles available for the number of unemployed technology professionals. The U.S. Bureau of Labor Statistics (BLS) monthly jobs report shows that the unemployment rate remained mostly unchanged, and separate analysis of the findings reveals that unemployment for technology professionals also remained flat.

    “The job market for IT Pros had a major shift losing an average of 4,983 jobs per month over the past 12 months,” said M. Victor Janulaitis, CEO of Janco Associates, in a statement. “According to the latest BLS data analyzed, there are now approximately 4.18 million jobs for IT Professionals in the US. Layoffs at big tech companies continued to hurt overall IT hiring. Large high-tech firms continue to lay off to have better bottom lines. Included in that group of companies that have recently announced new layoffs are Intel, Microsoft, and Google.”

    According to CompTIA’s analysis of the BLS data, technology roles increased by 70,000 in October to about 6.5 million workers, and CompTIA pointed to job posting data that showed broad-based hiring across software, cybersecurity, support, data, and infrastructure. Still, CompTIA reports that tech industry employment declined by more than 4,000 jobs in October.

    “Despite the higher than usual noise in this month’s labor market data, there are a number of positives to point to on the tech employment front. The data indicates employers continue a balanced approach to hiring across core tech job roles and innovation enabling roles,” said Tim Herbert, chief research officer at CompTIA, in a statement.

    November 2024

    Cloud certifications bring in big dollars

    Skillsoft’s most recent ranking of the highest-paid IT certifications shows that IT professionals with certs in AWS, Google, and Nutanix earn more on average in the U.S.—some more than $200,000. According to Skillsoft’s tally, the top five highest-paying certifications are:

    • AWS Certified Security – Specialty: $203,597
    • Google Cloud – Professional Cloud Architect: $190,204
    • Nutanix Certified Professional – Multicloud Infrastructure (NCP-MCI) v6.5: $175,409
    • CCSP – Certified Cloud Security Professional: $171,524
    • CCNP Security: $168,159

    “Overall, the IT job market is characterized by a significant imbalance between supply and demand, which continues to drive salaries higher. Our data suggests that tech professionals skilled in cloud computing, security, data privacy, and risk management, as well as able to handle complex, multi-faceted IT environments, will be well-positioned for success,” says Greg Fuller, vice president of Codecademy Enterprise. “This year’s list shows that cloud computing skills remain in high demand and can be quite lucrative for tech professionals.” Read the full story here.

    October 2024

    Cybersecurity skills shortage persists

    There are not enough cybersecurity workers to fill the current number of open roles in the U.S. or globally as an ever-increasing threat landscape demands more security professionals. Recent data from CyberSeek shows that 265,000 more cybersecurity workers would be needed to solve current staffing needs. In addition, ISC2 Research reports that 90% of organizations report having skill gaps within their security teams in areas that include AI/ML (34%), cloud computing security (30%), and zero trust implementation (27%). Read the full story here.  

    October 2024

    Women in IT report gender bias in the workplace

    A recent survey revealed that 71% of 327 full-time female IT respondents said they work longer hours in hopes of more quickly advancing their careers. In addition, 70% of respondents said men in IT were likely to advance their careers or receive promotions more quickly than women. Some 31% of those surveyed said they believe that men are promoted faster. And almost two-thirds said their workplaces are not doing enough to promote or achieve gender equality, according to Acronis.

    To help foster more gender diversity, survey respondents said they could benefit from training and other courses, including: master classes, learning courses, and workshops (63%); networking events (58%); and memberships in professional organizations (44%). On the employer side, respondents said they believe organizations can help foster more gender equality in the workplace by offering mentorship opportunities (51%), actively hiring more diverse candidates (49%), and ensuring pay equity (49%). Read the full story here.

    October 2024

    Tech unemployment decreases in September

    Technology occupation employment increased by 118,000 new positions in September, according to CompTIA’s analysis of recent data released by the U.S. Bureau of Labor Statistics (BLS). The job growth pushed the tech unemployment rate down to 2.5% and included 8,583 net new positions for the month.

    The CompTIA Tech Jobs Report shows that job postings for future tech hiring grew to more than 516,000 active postings, including 225,000 new listings added in September. The jobs that saw the largest growth in percentage points in September were tech support specialists and database administrators. New hiring was driven by the cloud infrastructure, data processing and hosting, and tech services and custom software development sectors, CompTIA concluded from the BLS data.

    “It was never really a question of if, but when employers were going to resume hiring,” Tim Herbert, chief research officer, CompTIA, said in a statement. “A broad mix of companies viewed recent economic developments as the green light to move forward in addressing their tech talent needs.”

    October 2024

    CompTIA bolsters Cloud+ certification

    CompTIA has updated its Cloud+ professional certification to include DevOps, combining software development know-how with network operations experience, and other areas of expertise such as troubleshooting common cloud management issues.

    The updated certification course will cover cloud architecture, design, and deployment; security; provisioning and configuring cloud resources; managing operations throughout the cloud environment life cycle; automation and virtualization; backup and recovery; high-availability; fundamental DevOps concepts; and cloud management. The program will also include expertise on technologies such as machine learning, artificial intelligence, and the Internet of Things, according to CompTIA.

    “Businesses need to ensure that their teams have the skills to manage cloud and hybrid environments,” said Teresa Sears, senior vice president of product management at CompTIA, in a statement. “CompTIA Cloud+ gives team members the ability to manage complex migrations, oversee multi-cloud environments, secure data, and troubleshoot while maintaining cost-effective operations.”

    Technology professionals with CompTIA Cloud+ or CompTIA Network+ certifications can further their skills and validate their knowledge with the CompTIA CloudNetX certification, which is scheduled to be released early next year and is part of the CompTIA Xpert Series, CompTIA says.

    October 2024

    Pearson debuts genAI certification

    There’s a new genAI certification from Certiport, a Pearson VUE business. This week the provider unveiled its Generative AI Foundations certification, which is designed to equip professionals and students with the skills needed to work with genAI technologies. The certification will validate an individual’s knowledge in areas such as:

    • Understanding generative AI methods and models
    • Mastering the basics of prompt engineering and prompt refinement
    • Grasping the societal impact of AI, including recognizing bias and understanding privacy concerns

    The Generative AI Foundations certification is available now through Mindhub and Certiport as well as Pearson VUE’s online testing platform, OnVUE, and in test centers within the Certiport network.

    October 2024

    Mixed bag for network, system admin jobs

    Recent data from the U.S. Bureau of Labor Statistics (BLS) shows that while there will be growth for many IT positions between now and 2033, some network and computer systems administrator roles are expected to decline. The number of computer network architects will climb 13.4%, and computer network support specialists will see a 7.3% gain in jobs. Network and computer systems administrators will see a decline of 2.6%, however.

    Overall, the market segment that BLS calls “computer and mathematical occupations” is projected to grow 12.9% between 2023 and 2033, increasing by 699,000 jobs. That makes it the second fastest growing occupational group, behind healthcare support occupations (15.2%).

    Read the full story here: 10-year forecast shows growth in network architect jobs while sysadmin roles shrink

    September 2024

    IT employment ticks down in August

    IT employment ticked down 0.05% in August, resulting in the loss of 2,400 jobs month-over-month, according to an analysis of the high-tech employment market by TechServe Alliance. On a yearly basis, the IT job market shrank by 0.33%, with a loss of 17,500 positions. On a more positive note, the staffing company noted that engineering positions saw a more than 1% increase in a year-over-year comparison, adding 29,800 jobs in the same period.

    “As the overall job market softened in August, IT employment continued to struggle to gain momentum,” said Mark Roberts, TechServe’s CEO, in a statement. “Throughout 2024, job growth in IT has been effectively flat after 23 consecutive months of job losses. I continue to see IT employment moving sideways until the fog of uncertainty lifts over the economy, the national election, and ongoing geopolitical turbulence.”

    September 2024

    Employee education holding back AI success

    Employee education and training around AI will become more and more critical as research reveals that a majority of employees do not know how to apply the technology to their jobs.

    According to Slingshot’s 2024 Digital Work Trends Report, 77% of employees reported that they don’t feel they are completely trained or have adequate training on the AI tools offered to them by managers. And for the most part, managers agree, with just 27% saying that they feel employees are completely trained on the AI tools provided to employees.

    The research, conducted in Q2 2024 by Dynata and based on 253 respondents, also noted that AI skills and quality data are significant barriers to AI success. Nearly two-thirds (64%) of all respondents noted that their organization doesn’t have AI experts on their team, which is preventing their employers from offering AI tools. Another 45% pointed to the quality of data within the organization as a top reason AI tools aren’t offered at work. A third reason that AI isn’t prevalent in some workplaces is that organizations don’t have the tech infrastructure in place to implement AI tools.

    “Data is top of mind for employees too when it comes to AI: 33% of employers say their company would be ready to support AI if their company’s data was combed through for accuracy, and 32% say they need more training around data and AI before their company is ready,” the report reads.

    September 2024

    U.S. labor market continues downward slide

    The U.S. Bureau of Labor Statistics (BLS) this week released its most recent employment data that shows the ratio of job openings per unemployed worker continues to steadily decline, indicating unemployment rates will continue to rise.

    According to BLS Job Openings and Labor Turnover Summary (JOLTS) data, the number of job openings hit 7.7 million on the last day of July, while hires stood at 5.5 million and “separations” increased to 5.4 million. Separations include quits (3.3 million) and layoffs and discharges (1.8 million) for the same timeframe. The most recent numbers hint at more bad news for unemployment in the country, according to industry watchers.

    “The labor market is no longer cooling down to its pre-pandemic temperature, it’s dropped below it,” an Indeed Hiring Lab report on the BLS data stated. “The labor market is past moderation and trending toward deterioration.”

    For IT professionals, the BLS data shows that jobs in high tech might grow slightly by 5,000 jobs in 2024, but that will not be enough growth to offset the number of unemployed IT workers—which Janco Associates estimates is about 145,000.

    “According to the latest BLS data analyzed, there are now approximately 4.18 million jobs for IT professionals in the US. Layoffs at big tech companies continued to hurt overall IT hiring. Large high-tech firms continue to lay off to have better bottom lines. Included in that group of companies that have recently announced new layoffs are Intel, Microsoft, and Google,” said M. Victor Janulaitis, CEO of Janco, in a statement. “At the same time, BLS data shows that around 81,000 IT pros were hired but that 147,000 were looking for work in June. Our analysis predicts the same will be the case for July and August.”

    September 2024

    CompTIA unveils data science certification program

    Technology pros seeking to validate their data science competencies can now prove their knowledge with CompTIA’s DataX certification program.

    Part of CompTIA’s recently launched Xpert Series, the DataX program is based on input from data scientists working in private and public sectors and focuses on the skills critical to a data scientist’s success, such as: mathematics and statistics; modeling, analysis, and outcomes; operations and processes; machine learning; and specialized applications of data science. The program is designed for data scientists with five or more years of experience, and it identifies knowledge gaps as well as provides learning content to get candidates current on expert-level topics.

    “Earning a CompTIA DataX certification is a reliable indicator of a professional’s commitment to excellence in the field of data science,” said Teresa Sears, senior vice president of product management, CompTIA, in a statement. “This program validates the advanced analytics skills that help organizations enhance efficiency, mitigate risks, and maximize the value of their data assets.”

    August 2024

    CompTIA partners to provide IT training and certifications across Africa

    CompTIA is partnering with Gebeya Inc. to provide access to CompTIA’s library of IT, networking, cybersecurity and cloud computing courses. The collaboration will allow Africans interested in technology to access IT training and certification classes via CompTIA.

    Gebeya, a Pan-African talent cloud technology provider, says its mission “is to close the digital skills gap and drive digital transformation across Africa.” Partnering with CompTIA will enable aspiring technology workers in Africa to bolster their skills. “Our strategic partnership with CompTIA allows us to integrate a comprehensive skilling module within the Gebeya Talent Cloud, enabling our customers and partners to offer unmatched access to world-class IT training and certifications to their talent communities,” said Amadou Daffe, Gebeya CEO, in a statement.

    CompTIA offers vendor-neutral IT certifications that cover the fundamentals of several IT functions. The organization says its library of courses can help individuals stay current with today’s in-demand technology skills as well as enhance technical competency worldwide.

    “We have a shared mission to close the digital skills gap in Africa,” said Benjamin Ndambuki, CompTIA’s territory development representative for Africa, in a statement. “With Gebeya’s extensive reach and local expertise and CompTIA’s globally recognized certifications, we are confident we can empower a new generation of African tech professionals to thrive in the digital economy.”

    August 2024

    U.S. job growth weaker than forecast, unemployment rate creeping upward  

    New data released from the U.S. Bureau of Labor Statistics (BLS) shows earlier estimates of job growth were miscalculated. The agency reported this week that there were 818,000 fewer jobs added in the 12 months ending in March 2024 than previously reported. This information, coupled with reports from Indeed that the unemployment rate continues to slowly increase, is raising recession fears.

    According to Indeed’s Hiring Lab, “on a three-month average basis, the unemployment rate has risen .55 percentage points since its low of 3.5% in January 2023.” The adjusted BLS numbers suggest weak hiring and a cooler market than previously projected, but Indeed says there are reasons for “cautious optimism” about the U.S. labor market. For instance, the amount of available job postings and growth in wages could continue to attract more workers to the labor force.

    “In addition to a relative abundance of job opportunities, another factor that may be drawing workers back to the labor force in greater numbers is persistently strong wage growth, which has slowed from recent highs but remains on par with pre-pandemic levels,” Indeed reported.

    August 2024

    Talent gap threatens US semiconductor industry

    The semiconductor industry could be facing a major labor shortage as industry growth has outpaced the availability of skilled workers in the US. A recent report by McKinsey & Company found that public and private investment in the semiconductor industry in the US will expand to more than $250 billion by 2032 and will bring more than 160,000 new job openings in engineering and technical support to the industry. This, coupled with the steep decline of the US domestic semiconductor manufacturing workforce – which has dropped 43% from its peak employment levels in 2000 – means the industry will struggle to fill those jobs. At the current rate, the shortage of engineers and technicians could reach as high as 146,000 workers by 2029, according to the report.

    August 2024

    CompTIA wants to help build high-tech careers

    New career resources from CompTIA are designed to teach people about specific tech-related roles and empower them to tailor a career path that best aligns with their skills and experiences.

    “Too many people don’t know what it means to work in tech, so they’re scared, or they think the jobs are boring or are too hard,” said Todd Thibodeaux, president and CEO of CompTIA, in a statement. “We want to educate people about the dynamic employment opportunities available in tech; encourage them to know they can thrive in these jobs; and empower them with the knowledge and skills to succeed.”

    Among the new resources is CompTIA Career Explorer, which the nonprofit organization says will help professionals tailor a career path that aligns with their workstyles and lifestyles. With the tool, jobseekers can test drive “a day in the life of specific job roles and challenge themselves with real-time, true-to-life problem solving” related to the jobs.

    CompTIA Career+ will provide users with an immersive, interactive video experience that “showcases a day in the life of in-demand job roles,” according to CompTIA. This resource will feature up to 30 job roles, representing about 90% of all tech occupations.

    The organization announced the new resources at its CompTIA ChannelCon and Partner Summit conference. “We want people to associate CompTIA with the competencies and skills to work in technology,” Thibodeaux said.

    August 2024

    Where STEM jobs pay the most

    A new study conducted by Germany-based biotechnology provider Cytena shows that California provides the highest average salaries in the U.S. for those working in science, technology, engineering, and math (STEM) professions.

    Cytena analyzed salary data for more than 75 STEM jobs listed on company review website Glassdoor to determine which states in the U.S. paid the most for technology talent. California ranks first with an average salary of $124,937 across all the jobs in the study, which included positions ranging from medical professionals to mathematicians and data scientists to network and software engineers. Washington state placed a close second with the average annual salary falling just below $124,000, and New York landed in third place with an average annual salary of $114,437. Following the top three, Nevada, Maryland, Massachusetts, Idaho, Hawaii, Colorado, and Connecticut rounded out the top ten states in the U.S. that pay the highest salaries for STEM-related positions.

    July 2024

    SysAdmin Day 2024: Celebrate your systems administrators

    Friday, July 26 marks the 25th annual System Administrator Appreciation Day. Always celebrated on the last Friday in July, SysAdmin Day recognizes IT professionals who spend their days ensuring organizations and the infrastructure supporting them run smoothly. Some may say it is a thankless job, which is why Ted Kekatos created the day to honor the men and women working to install and configure hardware and software, manage networks and technology tools, help end users, and monitor the performance of the entire environment.

    Network and systems admins field complaint calls and solve incidents for end users, often without hearing how much they helped their colleagues. The unsung heroes of IT, sysadmins deserve this day of recognition — they might even deserve a gesture or gift to acknowledge all the long hours they work and how much they do behind the scenes.

    July 2024

    NetBrain launches network automation certification program

    NetBrain Technologies debuted its Network Automation Certification Program, which will recognize engineers with advanced network automation skills. The program will enable network engineers to validate their skills and communicate the skillsets to others, according to NetBrain. Initial exams for the program will be offered October 3 following the NetBrain Live Conference in Boston.

    NetBrain currently lists three network automation certifications on its website:

    • NetBrain Certified Automation Associate (NCAA): This certification demonstrates a mastery of the essentials of NetBrain Automation. Engineers with this certification can design, build, and implement automation that can be scaled networkwide to achieve an organization’s automation goals.
    • NetBrain Certified Automation Professional (NCAP): This certification validates network engineers as experts with proficiencies in network automation to enhance critical troubleshooting and diagnostic workflows across network operations, security, and IT infrastructures.
    • NetBrain Certified Automation Architect (NCAE): This certification distinguishes network engineers as network automation visionaries capable of shaping a corporate NetDevOps strategy from initial concept design and rollout through operation and enablement.

    July 2024

    Skillsoft develops genAI skills program with Microsoft

    Skillsoft announced it collaborated with Microsoft to develop its AI Skill Accelerator program, which will help organizations upskill their workforce to effectively use Microsoft AI technologies such as Copilot and Azure OpenAI as well as generative AI technologies more broadly. The goal is to drive improved business productivity and innovation using genAI applications more effectively.

    “This collaboration with Microsoft is the first of many AI learning experiences we will deliver to help our customers and their talent—from everyday end users to business leaders to AI developers—acquire the skills and tools they need to succeed in the age of AI,” said Ron Hovsepian, executive chair at Skillsoft, in a statement. According to Skillsoft’s annual IT Skills and Salary report that surveyed 5,700 tech professionals worldwide, 43% of respondents say their team’s skills in AI need improvement.

    Skillsoft’s AI Skill Accelerator offers a blended learning experience, including on-demand courses, one-on-one and group coaching, live instructor-led training, and hands-on practice labs. According to Skillsoft, the program will enable customers to:

    • Assess the current state of AI-related technology and leadership skills across the workforce
    • Index skills to make data-driven decisions about where talent can drive strategic business outcomes with AI
    • Develop AI skills rapidly with emerging training methods powered by Microsoft’s Azure OpenAI
    • Reassess existing talent and skills gaps through post-training benchmarks

    “Microsoft and Skillsoft have a long-standing relationship and share a common goal to enable AI transformation across every area of business,” said Jeana Jorgensen, corporate vice president of worldwide learning at Microsoft, in a statement. “This learning experience is designed to empower individuals and organizations to harness the full capabilities of generative AI, Microsoft Copilot, and Microsoft’s AI apps and services.”

    July 2024

    Tech industry adds jobs, IT unemployment increases

    Data from IT employment trackers shows that the technology industry added more than 7,500 new workers in June, while at the same time the overall unemployment rate for IT pros increased.

    According to CompTIA, the tech industry added some 7,540 new workers in June, which marks the biggest monthly increase so far this year. CompTIA’s analysis of U.S. Bureau of Labor Statistics (BLS) data also shows that the positive growth was offset by a loss of 22,000 tech occupations throughout the U.S. economy. “Despite pockets of growth, the recent data indicates a degree of downward pressure on tech employment,“ said Tim Herbert, chief research officer, CompTIA, in a statement. “A combination of factors, including AI FOMO, likely contributes to segments of employers taking a wait and see approach with tech hiring.”

    Separately, Janco Associates reported that the overall unemployment rate for IT pros in June grew to 5.9%, which is higher than the 4.1% U.S. national unemployment rate. Janco Associates also estimated that 7,700 jobs were added to the IT job market in May 2024. “The number of unemployed IT Pros rose from 129,000 to 147,000.  There still is a skills mismatch as positions continue to go unfilled as the available IT Pros do not have the requisite training and experience required. The BLS data shows that around 78,000 IT pros were hired but that 147,000 are looking for work,” Janco Associates reported.

    July 2024

    CompTIA Network+ cert gets an update

    CompTIA updated its Network+ certification to include more extensive coverage of modern network environments, factors related to physical network installations, and know-how to better secure and harden networks.

    Software-defined networking (SDN) and SD-WAN are covered in the updated Network+ exam, or N10-009. According to CompTIA, “the program introduces infrastructure as code (IaC), which is considered a transformative approach that leverages code for improved provisioning and support for computing infrastructure.”
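
    To make the infrastructure-as-code idea concrete, here is a small, hypothetical Python sketch (not taken from the exam) of the declare-then-reconcile pattern IaC tooling relies on: the desired state lives in version-controlled code, the actual state is read back from the network, and the tool computes the changes needed to converge the two. Real-world IaC workflows typically use tools such as Terraform or Ansible, but the pattern is the same.

```python
# Minimal sketch of the infrastructure-as-code pattern (hypothetical example):
# declare the desired state, read the actual state, and compute the changes a
# provisioning tool would need to apply to converge them.

# Desired state, as it might be kept in version control.
DESIRED_VLANS = {10: "users", 20: "voice", 30: "iot"}

# Actual state, as it might be read back from a switch or controller API.
ACTUAL_VLANS = {10: "users", 20: "phones", 40: "legacy"}

def plan(desired: dict, actual: dict) -> dict:
    """Return the add/change/remove actions needed to reach the desired state."""
    return {
        "add": [vid for vid in desired if vid not in actual],
        "change": [vid for vid in desired if vid in actual and desired[vid] != actual[vid]],
        "remove": [vid for vid in actual if vid not in desired],
    }

if __name__ == "__main__":
    print(plan(DESIRED_VLANS, ACTUAL_VLANS))
    # {'add': [30], 'change': [20], 'remove': [40]}
```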

    The updated Network+ certification program also now integrates zero-trust architecture and other forms of network fortification. Read more in the full story: CompTIA updates Network+ certification

    June 2024

    AWS adds two AI-focused certifications

    Amazon Web Services (AWS) launched two new certifications in artificial intelligence for IT professionals looking to boost their skills and land AI-related jobs. The additional know-how will help practitioners secure jobs that require emerging AI skills, which could offer a 47% higher salary in IT, according to an AWS study.

    AWS Certified AI Practitioner is a foundational program that validates knowledge of AI, machine learning (ML), and generative AI concepts and use cases, according to AWS. Candidates who are familiar with using AI/ML technologies on AWS and who complete the 120-minute, 85-question exam will be able to sharpen their skills with fundamental concepts as well as use cases for AI, ML, and genAI. The exam will cover topics such as prompt engineering, responsible AI, security and compliance for AI systems, and more.

    AWS Certified Machine Learning Engineer—Associate is a 170-minute exam with 85 questions that validates technical ability to implement ML workloads in production and to operationalize them. Individuals with at least one year of experience using Amazon SageMaker and other ML engineering AWS services would be good candidates for this certification. The exam will cover topics such as data preparation for ML models, feature engineering, model training, security, and more.

    Registration for both new AWS certifications opens August 13.

    June 2024

    Cisco unveils AI-focused certification

    Cisco’s new AI certification aims to help prepare IT pros to design, provision and optimize networks and systems needed for demanding AI/ML workloads. Unveiled at its Cisco Live conference in Las Vegas, the Cisco Certified Design Expert (CCDE)-AI Infrastructure certification is a vendor-agnostic, expert-level certification. With it, tech professionals will be able to design network architectures optimized for AI workloads, and “they’ll be able to do this while incorporating the unique business requirements of AI, such as trade-offs for cost optimization and power, and the matching of computing power and cloud needs to measured carbon use,” wrote Par Merat, vice president of Cisco Learning and Certifications, in a blog post about the new cert.

    According to Cisco, the new CCDE-AI Infrastructure certification addresses topics including designing for GPU optimization as well as building high-performance generative AI network fabrics. Those seeking this certification will also learn about sustainability and compliance of networks that support AI. The skills will be needed across organizations, according to the Cisco AI Readiness Index, which found that 90% of organizations are investing to try to overcome AI skills gaps. Read more here: Cisco debuts CCDE-AI Infrastructure certification

    June 2024

    U.S. cybersecurity talent demand outpaces supply

    As businesses continue to seek cybersecurity talent, the current supply of skilled workers will not meet the demand in 2024, according to recent data from CyberSeek, a data analysis and aggregation tool powered by a collaboration among Lightcast, NICE, and CompTIA.

    There are only enough available workers to fill 85% of the current cybersecurity jobs throughout the U.S. economy, according to CyberSeek data, and more than 225,000 workers are needed to close the cybersecurity skills gap. The data also shows that job postings for all tech occupations declined by 37% between May 2023 and April 2024.

    “Although demand for cybersecurity jobs is beginning to normalize to pre-pandemic levels, the longstanding cyber talent gap persists,” said Will Markow, vice president of applied research at Lightcast, in a statement. “At the same time, new threats and technologies are causing cybersecurity skill requirements to evolve at a breakneck pace, forcing employers, educators, and individuals to proactively anticipate and prepare for an ever-changing cyber landscape.”

    Positions in the highest demand include network engineers, systems administrators, cybersecurity engineers, cybersecurity analysts, security engineers, systems engineers, information systems security officers, network administrators, information security analysts, and software engineers, according to the CyberSeek data.

    “Building a robust cybersecurity presence often requires changes in talent acquisition strategies and tactics,” said Hannah Johnson, senior vice president, tech talent programs, CompTIA, in a statement. “That can include upskilling less experienced cybersecurity professionals for more advanced roles, or hiring people who demonstrate subject matter expertise via professional certifications or other credentials.”

    June 2024

    Average salary for IT pros surpasses $100k

    Recent employment data shows that the median salary for IT professionals is now $100,399, with total compensation (including bonuses and fringe benefits) reaching $103,692. Management consulting firm Janco Associates, Inc. reported that IT salaries have risen by 3.28% in the past 12 months, even while the unemployment rate for IT workers hits 5%. Executives continue to see the biggest paychecks with total compensation packages increasing by 7.48% and median compensation reaching $184,354.

    “Salary compression” is another trend Janco Associates noted. This occurs when new hires are offered salaries at the higher end of the pay range for existing positions, often getting paid more than current employees in the same roles.

    Midsized enterprise companies are seeing more attrition than their large enterprise counterparts, while salaries in midsized companies are also rising faster than they are in large enterprises. Salary levels in midsized enterprises increased 5.46% versus 2.56% in larger enterprises, according to Janco Associates.

    May 2024

    AI, IT operations among the most in-demand IT skills

    New research and survey results from IDC show that a growing lack of in-demand IT skills could be negatively impacting businesses’ bottom lines.

    The IDC report, Enterprise Resilience: IT Skilling Strategies, 2024, reveals the most in-demand skills at enterprise organizations right now. Among the 811 respondents, artificial intelligence tops the list, cited by 45% of respondents, followed closely by IT operations (44%) and cloud solutions-architecture (36%). Other skills in demand right now include: API integration (33%), generative AI (32%), cloud solutions-data management/storage (32%), data analysis (30%), cybersecurity/data security (28%), IoT software development (28%), and IT service management (27%).

    Nearly two-thirds (63%) of the IT leaders at North American organizations said the lack of these skills has delayed digital transformation initiatives, most by an average of three to 10 months. Survey respondents detailed the negative impacts of lacking skills in their IT organizations:

    • Missed revenue goals: 62%
    • Product delays: 61%
    • Quality problems: 59%
    • Declining customer satisfaction: 59%
    • Lost revenue: 57%

    Considering these survey results, IDC predicts that by 2026, 90% of organizations worldwide will feel the pain of the IT skills crisis, potentially costing up to $5.5 trillion in delays, quality issues, and revenue loss. “Getting the right people with the right skills into the right roles has never been so difficult,” said Gina Smith, PhD, research director for IDC’s IT Skills for Digital Business practice, in a statement. “As IT skills shortages widen and the arrival of new technology accelerates, enterprises must find creative ways to hire, train, upskill, and reskill their employees. A culture of learning is the single best way to get there.”

    May 2024

    Organizations abandon IT projects due to skills gap

    A lack of specific technology skills worries IT executives, who report they will not be able to adopt new technologies, maintain legacy systems, keep business opportunities, and retain clients if the skills gap persists.

    In a recent survey by online professional training provider Pluralsight, 96% of technologists said their workload has increased due to the skills gap, and 78% also reported that they abandoned projects partway through because they didn’t have employees with the necessary IT skills to successfully finish. While most organizations (78%) said their skills gap has improved since last year, survey respondents reported that cybersecurity, cloud, and software development are the top three areas in which a skills gap exists. IT executives surveyed said they worry the skills gap in their organizations will make it difficult to:

    • Adopt new technology: 57%
    • Maintain legacy systems: 53%
    • Keep business opportunities: 44%
    • Retain clients: 33%

    Pluralsight surveyed 1,400 executives and IT professionals across the U.S., U.K., and India to learn more about the technical skills gap and how organizations are addressing a lack of expertise in specific technology areas.

    May 2024

    Lack of skills stymies network automation efforts

    Network automation continues to challenge IT leaders, and one factor is a lack of skills on staff.

    When research firm Enterprise Management Associates surveyed 354 IT professionals about network automation, just 18% rated their network automation strategies as a complete success, and 54% said they have achieved partial success. The rest said they were uncertain of the level of success achieved or admitted failure with their network automation projects.

    More than one-fourth (26.8%) of the respondents pointed to staffing issues such as skills gaps and staff churn as a business challenge. “The most challenging thing for me is the lack of network engineers who can contribute to automation,” said a network engineer at a midmarket business services company in the EMA report. “The community is small, and it’s hard to find people who can help you solve a problem.”

    April 2024

    CompTIA plans AI certification roadmap

    IT certification and training group CompTIA is expanding its product and program roadmap to meet the growing demand for AI-related skill sets.

    AI is becoming critical to existing job functions. At the same time, new roles are starting to land on employers’ radar. “Two entirely new job roles—prompt engineering and AI systems architects—are emerging. These positions align with the AI priorities of many organizations,” said Teresa Sears, vice president of product management at CompTIA.

    Millions of IT professionals will need to acquire new AI skills to meet the needs of the job market, said Thomas Reilly, CompTIA’s chief product officer, in a statement. “We intend to create a range of certifications and training offerings spanning the complete career arc, from foundational knowledge for pre-career and early career learners to advanced skills for professionals with years of workforce experience.”

    February 2024

    IT job growth flattened in 2023

    The number of new IT jobs created in calendar year 2023 flattened with just 700 positions added, which signals continued concerns about the economy and growing demand for skills focused on emerging technologies. For comparison, 2022 saw 267,000 jobs added, with industry watchers attributing the dramatic difference to tech layoffs and other cost-cutting measures.

    According to Janco Associates, despite companies adding some 21,300 jobs in the fourth quarter of 2023, the overall increase for the entire calendar year still comes to just 700 new positions. 

    “Based on our analysis, the IT job market and opportunities for IT professionals are poor at best. In the past 12 months, telecommunications lost 26,400 jobs, content providers lost 9,300 jobs, and other information services lost 10,300 jobs,” said M. Victor Janulaitis, CEO at Janco, in a statement. “Gainers in the same period were computer system designers gaining 32,300 jobs and hosting providers gaining 14,000.”

    January 2024

    Positive hiring plans for new year

    Robert Half reports that the job market will remain resilient heading into 2024. According to the talent solutions provider’s recent survey, more than half of U.S. companies plan to increase hiring in the first half of 2024. While the data is not limited to the IT sector, the research shows 57% plan to add new permanent positions in the first six months of the year while another 39% anticipate hiring for vacant positions and 67% will hire contract workers as a staffing strategy.

    Specific to the technology sector, 69% of the more than 1,850 hiring managers surveyed reported they would be adding new permanent roles for those professions. Still, challenges will persist into the new year, according to Robert Half, which reported 90% of hiring managers have difficulty finding skilled professionals and 58% said it takes longer to hire for open roles compared to a year ago.

    December 2023

    Cisco CCNA and AWS cloud networking rank among highest paying IT certifications

    Cloud expertise and security know-how remain critical in building today’s networks, and these skills pay top dollar, according to Skillsoft’s annual ranking of the most valuable IT certifications. At number one on its list of the 20 top-paying IT certifications is Google Cloud-Professional Cloud Architect with an average annual salary of $200,960.

    In addition to several cloud certifications, there are five security, networking, and system architect certifications on Skillsoft’s top 20 list:

    • ISACA Certified Information Security Manager (CISM): The average annual salary for those with CISM certification is $167,396, a slight increase over last year’s $162,347.
    • ISC2 Certified Information Systems Security Professional (CISSP): This certification consistently delivers an average annual salary of $156,699, according to Skillsoft.
    • ISACA Certified Information Systems Auditor (CISA): Professionals with a CISA certification earn an average annual salary of $154,500, an increase over last year’s $142,336.
    • AWS Certified Advanced Networking-Specialty: This certification commands an annual average salary of $153,031.
    • Cisco Certified Network Associate (CCNA): This certification commands an average annual salary of $128,651.

    November 2023



  • Video Friday: Multimodal Humanoid Walks, Flies, Drives

    Video Friday: Multimodal Humanoid Walks, Flies, Drives

    Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

    IROS 2025: 19–25 October 2025, HANGZHOU, CHINA

    Enjoy today’s videos!

    Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute in Abu Dhabi, UAE, recently conducted a demonstration of X1, a multirobot system developed as part of a three-year collaboration between the two institutes. During the demo, M4, a multimodal robot developed by CAST, launches in drone-mode from a humanoid robot’s back. It lands and converts into driving mode and then back again, as needed. The demonstration underscored the kind of progress that is possible when engineers from multiple institutions at the forefront of autonomous systems and technologies truly collaborate.

    [ Caltech Center for Autonomous Systems and Technologies ]

    Spot robot performs dynamic whole-body manipulation using a combination of reinforcement learning and sampling-based control. Behavior shown in the video is fully autonomous, including the dynamic selection of contacts on the arm, legs, and body, and coordination between the manipulation and locomotion processes. The tire weighs 15 kg (33 lbs), making its mass and inertial energy significant compared to the weight of the robot. An external motion capture system was used to simplify perception and an external computer linked by WiFi performed the intensive computational operations.

    Spot’s arm is stronger than I thought. Also, the arm-foot collaboration is pretty wild.

    [ Robotics and AI Institute ]

    Figure 03 represents an unprecedented advancement in taking humanoid robots from experimental prototypes to deployable, scalable products. By uniting advanced perception and tactile intelligence with home-safe design and mass-manufacturing readiness, Figure has built a platform capable of learning, adapting, and working across both domestic and commercial settings. Designed for Helix, the home, and the world at scale, Figure 03 establishes the foundation for true general-purpose robotics – one capable of transforming how people live and work.

    The kid and the dog in those clips make me very, very nervous.

    [ Figure ]

    Researchers have invented a new super agile robot that can cleverly change shape thanks to amorphous characteristics akin to the popular Marvel anti-hero Venom. Researchers used a special material called electro-morphing gel (e-MG) which allows the robot to show shapeshifting functions, allowing them to bend, stretch, and move in ways that were previously difficult or impossible, through manipulation of electric fields from ultralightweight electrodes.

    [ University of Bristol ]

    This is very preliminary of course, but I love the idea of quadrupedal robots physically assisting each other to surmount obstacles like this.

    [ Robot Perception and Learning Lab ]

    Have we reached peak dynamic humanoid yet?

    [ Unitree ]

    Dynamic manipulation, such as robot tossing or throwing objects, has recently gained attention as a novel paradigm to speed up logistic operations. However, the focus has predominantly been on the object’s landing location, irrespective of its final orientation. In this work, we present a method enabling a robot to accurately “throw-flip” objects to a desired landing pose (position and orientation).

    [ LASA ]

    I don’t care all that much about “industry-oriented” quadrupeds. I do care very much about “rideable” quadrupeds.

    [ MagicLab ]

    I am not yet at the point where I would trust any humanoid around priceless ancient relics. Any humanoid, not just the robotic ones.

    [ LimX ]

    This CMU RI Seminar is from Matt Mason, Professor Emeritus at CMU, entitled “A Manipulation Journey.”

    The talk will revisit my career in manipulation research, focusing on projects that might offer some useful lessons for others. We will start with my beginnings at the MIT AI Lab and my MS thesis, which is still my most cited work, then continue with my arrival at CMU, a discussion with Allen Newell, an exercise to envision a coherent research program, and how that led to a second and third childhood. The talk will conclude with some discussion of lessons learned.

    [ Carnegie Mellon University Robotics Institute ]

    Dr. Christian Hubicki highlights and explains the past year of humanoid robotics research and news.

    [ Florida State University ]

    More excellent robotics discussions from ICRA@40.

    [ ICRA@40 ]



  • Nvidia: Latest news and insights

    Nvidia: Latest news and insights

    More processor coverage on Network World:
    Intel news and insights |
    AMD news and insights

    With its legacy of innovation in GPU technology, Nvidia has become a dominant force in the AI market. Nvidia’s partners read like a technology who’s who list – e.g., AWS, Google Cloud, Microsoft Azure, Dell, HPE – and its reach also crosses into vertical industries such as healthcare, finance, automotive, and manufacturing.

    From its gaming roots, Nvidia’s GPUs have evolved to power breakthroughs in scientific simulations, data analysis, and machine learning.

    Follow this page for the latest news, analysis, and features on Nvidia’s advancements and their impact on enterprise transformation.

    Nvidia news and analysis

    Nvidia, Infineon partner for AI data center power overhaul

    October 16, 2025: Infineon is teaming up with Nvidia to upgrade the outdated power architecture of AI data centers and replace it with a centralized high-voltage DC power setup.

    Nvidia’s DGX Spark desktop supercomputer is on sale now, but hard to find

    October 15, 2025: Nvidia’s “personal AI supercomputer,” the DGX Spark, may run fast but it’s been slow getting here. It finally went on sale today, five months later than the company initially promised, and early units are hard to find.

    Inside Nvidia’s ‘grid-to-chip’ vision: How Vera Rubin and Spectrum-XGS advance AI giga-factories

    October 13, 2025: Nvidia will be front-and-center at this week’s Global Summit for members of the Open Compute Project. The company is making announcements on several fronts, including the debut of Vera Rubin MGX, its next-gen architecture fusing CPUs and GPUs, and Spectrum-XGS Ethernet, a networking fabric designed for “giga-scale” AI factories.

    Nvidia and Fujitsu team for vertical industry AI projects

    October 6, 2025: Nvidia has partnered with Fujitsu to collaborate on vertical industry-specific artificial intelligence projects. The partnership will focus on co-developing and delivering an AI agent platform tailored for industry-specific agents in sectors such as healthcare, manufacturing, and robotics.

    Nvidia and OpenAI open $100B, 10 GW data center alliance

    September 23, 2025: OpenAI and Nvidia will create a strategic partnership to deploy at least 10 gigawatts of Nvidia systems for OpenAI’s next-generation AI infrastructure. The first phase is expected to come online in the second half of 2026 using Nvidia’s Vera Rubin CPU/GPU combination platform to train and run new models.

    Who wins/loses with the Intel-Nvidia union?

    September 22, 2025: Nvidia is dipping into its $56 billion bank account to acquire a 5% stake in Intel for $5 billion, making it the second largest shareholder of Intel stock after the federal government’s recent investment. The deal gives Nvidia greater access to the x86 ecosystem, which is important for the enterprise data center market, and gives Intel access to in-demand GPUs that can help pull its CPU products along as well.

    Nvidia reportedly acquires Enfabrica CEO and chip technology license

    September 19, 2025: Nvidia has hired away the CEO and other staff of chip interconnect maker Enfabrica, and licensed its core technologies in a deal worth over $900 million. Behind the move is demand for computing capacity to power generative AI for the likes of OpenAI, Anthropic, Mistral, AWS, Microsoft, and Google.

    September 18, 2025: Intel will collaborate with Nvidia to design CPUs with Nvidia’s NVLink high-speed chip interconnect. Nvidia and Intel also agreed to “jointly develop multiple generations of custom data center and PC products,” they said in a joint statement.

    China’s strike on Nvidia threatens global AI supply chains, sparking enterprise concerns

    September 16, 2025: China has accused Nvidia of breaching its anti-monopoly law, a move that could disrupt the chipmaker’s global operations and heighten risks for enterprises dependent on its GPUs as US-China trade tensions escalate.

    Nvidia rolls out new GPUs for AI inferencing, large workloads

    September 9, 2025: Nvidia has taken the wraps off a new purpose-built GPU along with a next-generation platform specifically targeted at massive-context processing as well as software coding and generative video.

    Cadence adds Nvidia to digital twin tool for data center design

    September 9, 2025: Cadence has updated its Cadence Reality Digital Twin Platform library with the addition of digital twins for Nvidia’s DGX SuperPOD with DGX GB200 systems.

    Nvidia networking roadmap: Ethernet, InfiniBand, co-packaged optics will shape data center of the future

    September 4, 2025: Nvidia’s networking roadmap is based on data centers’ evolution into a new unit of computing, shifting the focus from CPUs to GPUs as the primary computing units and distributing functions across different components to support the infrastructure for AI workloads.

    Nvidia’s new computer gives AI brains to robots

    August 25, 2025: Nvidia CEO Jensen Huang sees a future where billions of robots serve humans, bringing in trillions of dollars in revenue for the company. To meet that goal, Nvidia on Monday unveiled a new computing device that will go into high-performing robots that could then try to replicate human behavior.

    Nvidia turns to software to speed up its data center networking hardware for AI

    August 22, 2025: Nvidia wants to make long-haul GPU-to-GPU communication over Ethernet faster and more reliable, and hopes to achieve that with its new Spectrum-XGS algorithms, software protocols baked into Nvidia’s latest Ethernet gear.

    Nvidia: ‘Graphics 3.0’ will drive physical AI productivity

    August 15, 2025: Nvidia has floated the idea of “Graphics 3.0” with the hope of making AI-generated graphics central to physical productivity. The concept revolves around graphics created by genAI tools. Nvidia says AI-generated graphics could help in training robots to do their jobs in the physical world or by helping AI assistants automate the creation of equipment and structures.

    Nvidia launches Blackwell-powered RTX Pro GPUs for compact AI workstations

    August 12, 2025: Nvidia announced two new professional GPUs, the RTX Pro 4000 Small Form Factor (SFF) and the RTX Pro 2000. Built on its Blackwell architecture, Nvidia’s new GPUs aim to deliver powerful AI capabilities in compact desktop and workstation deployments.

    Nvidia’s new genAI model helps robots think like humans

    August 11, 2025: Nvidia has developed a genAI model to help robots make human-like decisions by analyzing surrounding scenes. The Cosmos Reason model in robots can take in information from video and graphics input, analyze the data, and use its understanding to make decisions.

    Nvidia patches critical Triton server bugs that threaten AI model security

    August 5, 2025: A surprising attack chain in Nvidia’s Triton Inference Server, starting with a seemingly minor memory-name leak, could allow full remote server takeover without user authentication.

    China demands ‘security evidence’ from Nvidia over H20 chip backdoor fears

    August 4, 2025: China escalated pressure on Nvidia with the state-controlled People’s Daily publishing an opinion piece titled “Nvidia, how can I trust you?” — a day after regulators summoned company officials over alleged security vulnerabilities in H20 artificial intelligence chips.

    Nvidia to restart H20 exports to China, unveils new export-compliant GPU

    July 15, 2025: Nvidia will restart H20 AI chip sales to China and release a new GPU model compliant with export rules, a move that could impact global AI hardware strategies for enterprise IT teams. Nvidia has applied for US approval to resume sales and says that the government has indicated licenses will be granted and deliveries could begin soon.

    Nvidia GPUs are vulnerable to Rowhammer attacks

    July 15, 2025: Nvidia has issued a security reminder to application developers, computer manufacturers, and IT leaders that modern memory chips in graphic processors are potentially susceptible to so-called Rowhammer exploits after Canadian university researchers proved that an Nvidia A6000 GPU could be successfully compromised with a similar attack.

    Nvidia hits $4T market cap as AI, high-performance semiconductors hit stride

    July 11, 2025: Nvidia became the first publicly traded company to surpass a $4 trillion market capitalization value, 13 months after surpassing the $3 trillion mark. This makes Nvidia the world’s most valuable company ahead of Apple and Microsoft.

    New Nvidia technology provides instant answers to encyclopedic-length questions

    July 8, 2025: Have a question that needs to process an encyclopedia-length dataset? Nvidia says its new technique can answer it instantly. Built leveraging the company’s Blackwell processor’s capabilities, the new “Helix Parallelism” method allows AI agents to process millions of words — think encyclopedia-length — and support up to 32x more users at a time.

    Nvidia doubles down on GPUs as a service

    July 8, 2025: Nvidia’s recent initiative to dive deeper into the GPU-as-a-service (GPUaaS) model marks a significant and strategic shift that reflects an evolving landscape within the cloud computing market. 

    Nvidia, Perplexity to partner with EU and Middle East AI firms to build sovereign LLMs

    June 12, 2025: Nvidia and AI search firm Perplexity said they are joining hands with model builders and cloud providers across Europe and the Middle East to refine sovereign large-language models (LLMs) and accelerate enterprise AI uptake in local industries.

    Nvidia: ‘Sovereign AI’ will change digital work

    June 11, 2025: Nvidia executives think sovereign AI has the potential to change digital work as generative AI (genAI) aligns with national priorities and local regulations.

    AWS cuts prices of some EC2 Nvidia GPU-accelerated instances

    June 9, 2025: AWS has reduced the prices of some of its Nvidia GPU-accelerated instances to attract more AI workloads while competing with rivals, such as Microsoft and Google, as demand for GPUs and the cost of securing them continues to grow.

    Nvidia aims to bring AI to wireless

    June 6, 2025: Nvidia hopes to maximize RAN infrastructure use (traditional networks average a low 30% to 35%), use AI to rewrite the air interface, and enhance performance and efficiency through radio signal processing. The longer-term goal is to seamlessly process AI traffic at the network edge to create new monetization opportunities for service providers.

    Oracle to spend $40B on Nvidia chips for OpenAI data center in Texas

    May 26, 2025: Oracle is reportedly spending about $40 billion on Nvidia’s high-performance computer chips to power OpenAI’s new data center in Texas, marking a pivotal shift in the AI infrastructure landscape that has significant implications for enterprise IT strategies.

    Nvidia eyes China rebound with stripped-down AI chip tailored to export limits

    May 26, 2025: Nvidia plans to launch a lower-cost AI chip for China in June, aiming to protect market share under the US export controls and signal a broader shift toward affordable, segmented products that could impact global enterprise AI spending.

    Nvidia introduces ‘ridesharing for AI’ with DGX Cloud Lepton

    May 19, 2025: Nvidia introduced DGX Cloud Lepton, an AI-centric cloud software program that makes it easier for AI factories to rent out their hardware to developers who wish to access performant compute globally.

    May 19, 2025: Nvidia kicked off the Computex systems hardware tradeshow with the news it has opened the NVLink interconnect technology to the competition with the introduction of NVLink Fusion. NVLink is a high-speed interconnect born out of its Mellanox networking group which lets multiple GPUs in a system or rack share compute and memory resources, thus making many GPUs appear to the system as a single processor.

    AMD, Nvidia partner with Saudi startup to build multi-billion dollar AI service centers

    May 15, 2025: As part of the avalanche of business deals coming from President Trump’s Middle East tour, both AMD and Nvidia have struck multi-billion dollar deals with an emerging Saudi AI firm. The deals served as the coming out party for Humain, a state-backed artificial intelligence (AI) company that operates under the Kingdom’s Public Investment Fund (PIF) and is chaired by Crown Prince Mohammed bin Salman. 

    Nvidia, ServiceNow engineer open-source model to create AI agents

    May 6, 2025: Nvidia and ServiceNow have created an AI model that can help companies create learning AI agents to automate corporate workloads. The open-source Apriel model, available generally in the second quarter on HuggingFace, will help create AI agents that can make decisions around IT, human resources, and customer-service functions.

    Nvidia AI supercluster targets agents, reasoning models on Oracle Cloud

    April 29, 2025: The move marks the first wave of liquid-cooled Nvidia GB200 NVL72 racks in OCI data centers, involving thousands of Nvidia Grace CPUs and Blackwell GPUs. 

    Nvidia says NeMo microservices now generally available

    April 23, 2025: Nvidia announced the general availability of neural module (NeMo) microservices, a modular platform for building and customizing gen AI models and AI agents. NeMo microservices integrate with partner platforms to provide features including prompt tuning, supervised fine-tuning, and knowledge retrieval tools.

    Nvidia expects ban on chip exports to China to cost $5.5B

    April 16, 2025: Nvidia now expects new US government restrictions on exports of its H20 chip to China will cost the company as much as $5.5 billion.

    Incomplete patching leaves Nvidia, Docker exposed to DOS attacks

    April 15, 2025: A critical race condition bug affecting the Nvidia Container Toolkit, which received a fix in September, might still be open to attacks owing to incomplete patching.

    Nvidia lays out plans to build AI supercomputers in the US

    April 14, 2025: There was mixed reaction from industry analysts over an announcement that Nvidia plans to produce AI supercomputers entirely in the US. The company said in a blog post that, together with its manufacturing partners, it has commissioned more than one million square feet (92,900 square meters) of manufacturing space to build and test Nvidia Blackwell chips in Arizona and AI supercomputers in Texas.

    Potential Nvidia chip shortage looms as Chinese customers rush to beat US sales ban

    April 2, 2025: The AI chip shortage could become even more dire as Chinese customers are purportedly looking to hoard Nvidia chips ahead of a proposed US sales ban. According to inside sources, Chinese companies including ByteDance, Alibaba Group, and Tencent Holdings have ordered at least $16 billion worth of Nvidia’s H20 server chips for running AI workloads in just the first three months of this year.

    Nvidia’s Blackwell raises the bar with new MLPerf Inference V5.0 results

    April 2, 2025: Nvidia released a set of MLPerf Inference V5.0 benchmark results for its Blackwell GPU, the successor to Hopper, saying that its GB200 NVL72 system, a rack-scale offering designed for AI reasoning, set a series of performance records.

    5 big takeaways from Nvidia GTC

    March 25, 2025: Now that the dust has settled from Nvidia’s GTC 2025, a few industry experts weighed in on some core big picture developments from the conference. Here are five of their top observations.

    Nvidia wants to be a one-stop enterprise technology shop

    March 24, 2025: After last week’s Nvidia GTC 2025 event, a new, fuller picture of the vendor emerged. Analysts agree that Nvidia is not just a graphics chip provider anymore. It’s a full-stack solution provider, and GPUs are just one of many parts.

    Nvidia launches AgentIQ toolkit to connect disparate AI agents

    March 21, 2025: As enterprises look to adopt agentic AI to boost the efficiency of their applications, Nvidia introduced a new open-source software library — AgentIQ toolkit — to help developers connect disparate agents and agent frameworks. The toolkit, according to Nvidia, packs in a variety of tools, including ones to weave in RAG, search, and conversational UI into agentic AI applications.

    Nvidia launches research center to accelerate quantum computing breakthrough

    March 21, 2025: In a move to help accelerate the timeline for practical, real-world quantum applications, Nvidia is establishing the Nvidia Accelerated Quantum Research Center. “Quantum computing will augment AI supercomputers to tackle some of the world’s most important problems,” Nvidia CEO Jensen Huang said.

    Nvidia, xAI and two energy giants join genAI infrastructure initiative

    March 19, 2025: An industry generative artificial intelligence (genAI) alliance, the AI Infrastructure Partnership (AIP), on Wednesday announced that xAI, Nvidia, GE Vernova, and NextEra Energy were joining BlackRock, Microsoft, and Global Infrastructure Partners as members.

    IBM broadens access to Nvidia technology for enterprise AI

    March 19, 2025: New collaborations between IBM and Nvidia have yielded a content-aware storage capability for IBM’s hybrid cloud infrastructure, expanded integration between watsonx and Nvidia NIM, and AI services from IBM Consulting that use Nvidia Blueprints.

    Nvidia’s silicon photonics switches bring better power efficiency to AI data centers

    March 19, 2025: Amid the flood of news from Nvidia’s annual GTC event, one item stood out. Nvidia introduced new silicon photonics network switches that integrate network optics into the switch using a technique called co-packaged optics (CPO), replacing traditional external pluggable transceivers. While Nvidia alluded to its new switches providing a cost savings, the primary benefit is to reduce power consumption with an improvement in network resiliency.

    What is Nvidia Dynamo, and why does it matter to enterprises?

    March 19, 2025: Chipmaker Nvidia released new open-source inferencing software, Dynamo, at its GTC 2025 conference; it will allow enterprises to increase throughput and reduce cost while using large language models on Nvidia GPUs.

    Nvidia, xAI and two energy giants join genAI infrastructure initiative

    March 19, 2025: The AI Infrastructure Partnership (AIP) announced that xAI, Nvidia, GE Vernova, and NextEra Energy have joined the group. But given that no financial commitments or any other details were released, will it make a difference?

    HPE, Nvidia broaden AI infrastructure lineup

    March 19, 2025: HPE news from Nvidia GTC includes a new Private Cloud AI developer kit, Nvidia AI blueprints, GPU optimization capabilities, and servers built with Nvidia Blackwell Ultra and Blackwell architecture.

    Cisco, Nvidia team to deliver secure AI factory infrastructure

    March 18, 2025: Cisco and Nvidia have expanded their partnership to create their most advanced AI architecture package to date, designed to promote secure enterprise AI networking.

    Nvidia’s ‘hard pivot’ to AI reasoning bolsters Llama models for agentic AI

    March 18, 2025: The company has post-trained its new Llama Nemotron family of reasoning models to improve multistep math, coding, reasoning, and complex decision-making. The enhancements aim to provide developers and enterprises with a business-ready foundation for creating AI agents that can work independently or as part of connected teams.

    Nvidia details its GPU, CPU, and system roadmap for the next three years

    March 18, 2025: Nvidia CEO Jensen Huang shared previously unreleased specifications for its Rubin graphics processing unit (GPU), due in 2026, the Rubin Ultra coming in 2027, and announced the addition of a new GPU called Feynman to the mix for 2028.

    Oracle, Nvidia partner to add AI software into OCI services

    March 18, 2025: Nvidia’s AI Enterprise stack will be available natively through the OCI Console and will be available anywhere in OCI’s distributed cloud while providing enterprises access to over 160 AI tools for training and inference, including NIM microservices, the companies said in a joint statement at Nvidia’s annual GTC conference.

    Nvidia GTC 2025: What to expect from the AI leader

    March 3, 2025: Last year, Nvidia’s GTC 2024 grabbed headlines with the introduction of the Blackwell architecture and the DGX systems powered by it. With Nvidia GTC 2025 right around the corner, the tech world is eager to see what Nvidia – and its partners and competitors – will unveil next. 

    Cisco, Nvidia expand AI partnership to include Silicon One technology

    February 25, 2025: Cisco and Nvidia have expanded their collaboration to support enterprise AI implementations by tying Cisco’s Silicon One technology to Nvidia’s Ethernet networking platform. The extended agreement is designed to offer customers yet another way to support AI workloads across the data center and strengthens both companies’ strategies to expand the role of Ethernet networking for AI in the enterprise.

    Nvidia forges healthcare partnerships to advance AI-driven genomics, drug discovery

    February 14, 2025: Through new partnerships with industry leaders, Nvidia aims to advance practical use cases for AI in healthcare and life sciences. It’s a logical move: Healthcare has the most significant upside, particularly in patient care, among all the industries applicable to AI. 

    Nvidia partners with cybersecurity vendors for real-time monitoring

    February 12, 2025: Nvidia partnered with leading cybersecurity firms to provide real-time security protection using its accelerator and networking hardware in combination with its AI software. Under the agreement, Nvidia will provide integration of its BlueField and Morpheus hardware with cyber defense software from Armis, Check Point Software Technologies, CrowdStrike, Deloitte, and World Wide Technology.

    Nvidia claims near 50% boost in AI storage speed

    February 7, 2025: Nvidia is touting a near 50% improvement in storage read bandwidth thanks to intelligence in its Spectrum-X Ethernet networking equipment, according to the vendor’s technical blog post. Spectrum-X is a combination of the company’s Spectrum-4 Ethernet switch and BlueField-3 SuperNIC smart networking card, which supports RoCE v2 for remote direct memory access (RDMA) over Converged Ethernet.

    Nvidia unveils preview of DeepSeek-R1 NIM microservice

    February 3, 2025: The chipmaker’s stock plummeted 17% after Chinese AI developer DeepSeek unveiled its DeepSeek-R1 LLM. Last week, Nvidia announced the DeepSeek-R1 model is now available as a preview Nvidia inference microservice (NIM) on build.nvidia.com.

    Nvidia unveils preview of DeepSeek-R1 NIM microservice

    January 31, 2025: Nvidia stock plummeted 17% after Chinese AI developer DeepSeek unveiled its DeepSeek-R1 LLM. Later the same week, the chipmaker turned around and announced the DeepSeek-R1 model is available as a preview Nvidia inference microservice (NIM) on build.nvidia.com.

    Nvidia intros new guardrail microservices for agentic AI

    January 16, 2025: Nvidia added new Nvidia inference microservices (NIMs) for AI guardrails to its Nvidia NeMo Guardrails software tools. The new microservices aim to help enterprises improve accuracy, security, and control of agentic AI applications, addressing a key reservation IT leaders have about adopting the technology.

    Nvidia year in review

    January 10, 2025: Last year was Nvidia’s year. Its command of mindshare and market share was unequaled among tech vendors. Here’s a recap of some of the key Nvidia events of 2024 that highlight just how powerful the world’s most dominant chip player is.

    Nvidia launches blueprints to help jumpstart AI projects

    January 8, 2025: Nvidia recently issued designs for AI factories after hyping up the idea for several months. Now it has come out with AI blueprints, essentially prebuilt templates that give developers a jump start on creating AI systems.

    Nvidia’s Project DIGITS puts AI supercomputing chips on the desktop

    January 6, 2025: Nvidia is readying a tiny desktop device called Project DIGITS, a “personal AI supercomputer” with a lightweight version of the Grace Blackwell platform found in its most powerful servers; it’s aimed at data scientists, researchers, and students who will be able to prototype, tune, and run large genAI models.

    Nvidia unveils generative physical AI platform, agentic AI advances at CES

    January 6, 2025: At CES in Las Vegas, Nvidia trumpeted a slew of AI announcements, with an emphasis on generative physical AI that promises a new revolution in factory and warehouse automation. “AI requires us to build an entirely new computing stack to build AI factories, accelerated computing at data center scale,” said Rev Lebaredian, vice president of omniverse and simulation technology at Nvidia.

    Verizon, Nvidia team up for enterprise AI networking

    December 30, 2024: Verizon and Nvidia partnered to build AI services for enterprises that run workloads over Verizon’s 5G private network. The new offering, 5G Private Network with Enterprise AI, will run a range of AI applications and workloads over Verizon’s private 5G network with Mobile Edge Compute (MEC). MEC is a colocated infrastructure that is a part of Verizon’s public wireless network, bringing compute and storage closer to devices and endpoints for ultra-low latency.

    Nvidia’s Run:ai acquisition waved through by EU

    December 20, 2024: Nvidia will face no objections to its plan to acquire Israeli AI orchestration software vendor Run:ai Labs in Europe, after the European Commission gave the deal its approval today. But Nvidia may not be out of the woods yet. Competition authorities in other markets are closely examining the company’s acquisition strategy.

    China launches anti-monopoly probe into Nvidia amid rising US-China chip tensions

    December 10, 2024: China has initiated an investigation into Nvidia over alleged violations of the country’s anti-monopoly laws, signaling a potential escalation in the ongoing tech and trade tensions between Beijing and Washington.

    Nvidia Blackwell chips face serious heating issues

    November 18, 2024: Nvidia’s next-generation Blackwell data center processors have significant problems with overheating when installed in high-capacity server racks, forcing redesigns of the racks themselves, according to a report by The Information. These issues have reportedly led to design changes, meaning delays in shipping product and raising concern that its biggest customers, including Google, Meta, and Microsoft, will be able to deploy Blackwell servers according to their schedules.

    Nvidia to power India’s AI factories with tens of thousands of AI chips

    October 24, 2024: Nvidia plans to deploy thousands of Hopper GPUs in India to create AI factories and collaborate with Reliance Industries to develop AI infrastructure. Yotta Data Services, Tata Communications, E2E Networks, and Netweb will lead the AI factories — large-scale data centers for producing AI. Nvidia added that the expansion will provide nearly 180 exaflops of computing power.

    Nvidia contributes Blackwell rack design to Open Compute Project

    October 15, 2024: Nvidia contributed to the Open Compute Project its Blackwell GB200 NVL72 electro-mechanical designs – including the rack architecture, compute and switch tray mechanicals, liquid cooling and thermal environment specifications, and Nvidia NVLink cable cartridge volumetrics.

    As global AI energy usage mounts, Nvidia claims efficiency gains of up to 100,000X

    October 08, 2024: As concerns over AI energy consumption ratchet up, chip maker Nvidia is defending what it calls a steadfast commitment to sustainability. The company reports that its GPUs have experienced a 2,000X reduction in energy use over the last 10 years in training and a 100,000X energy reduction over that same time in generating tokens.

    Accenture forms new Nvidia business group focused on agentic AI adoption

    October 4, 2024: Accenture and Nvidia announced an expanded partnership focused on helping customers rapidly scale AI adoption. Accenture said the new group will use Accenture’s AI Refinery platform — built on the Nvidia AI stack, including Nvidia AI Foundry, Nvidia AI Enterprise, and Nvidia Omniverse — to help clients create a foundation for use of agentic AI.

    IBM expands Nvidia GPU options for cloud customers

    October 1, 2024: IBM expanded access to Nvidia GPUs on IBM Cloud to help enterprise customers advance their AI implementations, including large language model (LLM) training. IBM Cloud users can now access Nvidia H100 Tensor Core GPU instances in virtual private cloud and managed Red Hat OpenShift environments.

    Oracle to offer 131,072 Nvidia Blackwell GPUs via its cloud

    September 12, 2024: Oracle started taking pre-orders for 131,072 Nvidia Blackwell GPUs in the cloud via its Oracle Cloud Infrastructure (OCI) Supercluster to aid large language model (LLM) training and other use cases, the company announced at the CloudWorld 2024 conference. The launch of an offering that provides this many Blackwell GPUs, also known as Grace Blackwell (GB) 200, is significant as enterprises globally are faced with the unavailability of high-bandwidth memory (HBM) — a key component used in making GPUs.

    Why is the DOJ investigating Nvidia?

    September 11, 2024: After a stock sell-off following its quarterly earnings report, Nvidia’s pain was aggravated by news that the Department of Justice is escalating its investigation into the company for anticompetitive practices. According to a Bloomberg report, the DOJ sent a subpoena to Nvidia as part of a probe into alleged antitrust practices.

    Cisco, HPE, Dell announce support for Nvidia’s pretrained AI workflows

    September 4, 2024: Cisco, HPE, and Dell are using Nvidia’s new AI microservices blueprints to help enterprises streamline the deployment of generative AI applications. Nvidia announced its NIM Agent Blueprints, a catalogue of pretrained, customizable AI workflows that are designed to provide a jump-start for developers creating AI applications. NIM Agent Blueprints target a number of use cases, including customer service, virtual screening for computer-aided drug discovery, and a multimodal PDF data extraction workflow for retrieval-augmented generation (RAG) that can ingest vast quantities of data.

    Nvidia reportedly trained AI models on YouTube data

    August 4, 2024: Nvidia scraped huge amounts of data from YouTube to train its AI models, even though neither YouTube nor individual YouTube channels approved the move, according to leaked documents. Among other things, Nvidia reportedly used the YouTube data to train its deep learning model Cosmos, an algorithm for automated driving, a human-like AI avatar, and Omniverse, a tool for building 3D worlds.

    Can Intel’s new chips compete with Nvidia in the AI universe?

    June 9, 2024: Intel is aiming its next-generation X86 processors at AI tasks, even though the chips won’t actually run AI workloads themselves. At Computex, Intel announced its Xeon 6 processor line, talking up what it calls Efficient-cores (E-cores) that it said will deliver up to 4.2 times the performance of Xeon 5 processors. The first Xeon 6 CPU is the Sierra Forest version (6700 series); a more performance-oriented line, Granite Rapids with Performance cores (P-cores, or the 6900 series), will be released next quarter.

    Everyone but Nvidia joins forces for new AI interconnect

    May 30, 2024: A clear sign of Nvidia’s dominance is when Intel and AMD link arms to deliver a competing product. That’s what happened when AMD and Intel – along with Broadcom, Cisco, Google, Hewlett Packard Enterprise, Meta and Microsoft – formed the Ultra Accelerator Link (UALink) Promoter Group to develop high-speed interconnections between AI processors.

    Nvidia to build supercomputer for federal AI research

    May 15, 2024: The U.S. government will use an Nvidia DGX SuperPOD to provide researchers and developers access to much more computing power than they have had in the past to produce generative AI advances in areas such as climate science, healthcare and cybersecurity.

    Nvidia, Google Cloud team to boost AI startups

    April 11, 2024: Alphabet’s Google Cloud unveiled a slew of new products and services at Google Cloud Next 2024, among them a program to help startups and small businesses build generative AI applications and services. The initiative brings together the Nvidia Inception program for startups and the Google for Startups Cloud Program.

    Nvidia GTC 2024 wrap-up: Blackwell not the only big news

    March 29, 2024: Nvidia’s GTC is in our rearview mirror, and there was plenty of news beyond the major announcement of the Blackwell architecture and the massive new DGX systems powered by it. Here’s a rundown of some of the announcements you might have missed.

    Nvidia expands partnership with hyperscalers to boost AI training and development

    March 19, 2024: Nvidia extended its existing partnerships with hyperscalers Amazon Web Services (AWS), Google Cloud Platform, Microsoft Azure, and Oracle Cloud Infrastructure, to make available its latest GPUs and foundational large language models and to integrate its software across their platforms.

    Nvidia launches Blackwell GPU architecture

    March 18, 2024: Nvidia kicked off its GTC 2024 conference with the formal launch of Blackwell, its next-generation GPU architecture due at the end of the year. Blackwell uses a chiplet design, to a point. Whereas AMD’s designs have several chiplets, Blackwell has two very large dies that are tied together as one GPU with a high-speed interlink that operates at 10 terabytes per second, according to Ian Buck, vice president of HPC at Nvidia.

    Cisco, Nvidia target secure AI with expanded partnership

    February 9, 2024: Cisco and Nvidia expanded their partnership to offer integrated software and networking hardware that promises to help customers more easily spin up infrastructure to support AI applications. The agreement deepens both companies’ strategy to expand the role of Ethernet networking for AI workloads in the enterprise. It also gives both companies access to each other’s sales and support systems.

    Nvidia and Equinix partner for AI data center infrastructure

    January 9, 2024: Nvidia partnered with data center giant Equinix to offer what the vendors are calling Equinix Private AI with Nvidia DGX, a turnkey solution for companies that are looking to get into the generative AI game but lack the data center infrastructure and expertise to do it.



  • Nvidia, Infineon partner for AI data center power overhaul

    Nvidia, Infineon partner for AI data center power overhaul

    Infineon’s Power & Sensor Systems division is teaming up with Nvidia to upgrade the outdated power architecture of AI data centers and replace it with a centralized high-voltage DC power setup.

    With GPUs consuming more than 1 kW of power per chip, the amount of power going into a rack has exploded, and so has the number of racks. The rate of power failure is increasing due to the growing power burden. Racks have gone from demanding on average 120 kilowatts to 500 kilowatts in just a few years, and they’re expected to reach more than one megawatt before 2030, according to Infineon.

    The current fix is to throw power supplies at the problem and put many power supplies in one rack. This takes up space, generates heat, and increases the number of points of failure, according to Infineon.

    The solution is to convert power right at the GPU on the server board and to upgrade the backbone to 800 volts. That should squeeze more reliability and efficiency out of the system while dealing with the heat, Infineon stated. Nvidia announced the 800-volt direct current (VDC) power architecture at Computex 2025 as a much-needed replacement for the 54-volt backbone currently in use, which is overwhelmed by the demand of AI processors and increasingly prone to failure.

    “This makes sense with the power needs of AI and how it is growing,” said Alvin Nguyen, senior analyst with Forrester Research. “This helps mitigate power losses seen from lower voltage and AC systems, reduces the need for materials like copper for wiring/bus bars, better reliability, and better serviceability.”
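    As a rough back-of-the-envelope illustration of the loss argument (a simplified sketch assuming a 500 kW rack, a hypothetical 0.1 milliohm distribution path, and no converter losses), the Python snippet below compares the current and the resistive loss needed to deliver the same rack power at 54 V and at 800 V.

    ```python
    # Compare rack power distribution at 54 V vs. 800 V DC.
    # Assumptions (illustrative only): 500 kW rack, 0.1 milliohm path, no converter losses.

    RACK_POWER_W = 500_000        # 500 kW per rack, per Infineon's figures above
    PATH_RESISTANCE_OHM = 0.0001  # hypothetical 0.1 milliohm busbar/cable path

    def bus_current_amps(power_w: float, voltage_v: float) -> float:
        """Current required to deliver a given power at a given bus voltage (I = P / V)."""
        return power_w / voltage_v

    def resistive_loss_watts(current_a: float, resistance_ohm: float) -> float:
        """Ohmic loss dissipated in the distribution path (P_loss = I^2 * R)."""
        return current_a ** 2 * resistance_ohm

    for volts in (54, 800):
        amps = bus_current_amps(RACK_POWER_W, volts)
        loss = resistive_loss_watts(amps, PATH_RESISTANCE_OHM)
        print(f"{volts:>3} V bus: {amps:>8,.0f} A, ~{loss:,.0f} W lost in the path")

    # 54 V needs roughly 9,260 A and wastes about 8,600 W in this toy path;
    # 800 V needs roughly 625 A and wastes about 39 W. For a fixed delivered power,
    # losses scale as 1/V^2, so moving from 54 V to 800 V cuts them by (800/54)^2 ≈ 220x.
    ```

    That 1/V² scaling is the physics behind the quoted claims about mitigating power losses and needing less copper for busbars and wiring.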

    Infineon says a shift to a centralized 800 VDC architecture allows for reduced power losses and higher efficiency and reliability. However, the new architecture requires new power conversion solutions and safety mechanisms to prevent potential hazards and costly server downtime during service and maintenance.

    “There is no AI without power. That’s why we are working with Nvidia on intelligent power systems to meet the power demands of future AI data centers while providing a serviceable architecture that reduces system downtimes to a minimum,” said Adam White, division president of Power & Sensor Systems at Infineon Technologies, in a statement.

    Nvidia is making a full-court press on this new 800-volt backbone: more than 50 MGX partners are gearing up for it, along with ecosystem support for Nvidia Kyber, which connects 576 Rubin Ultra GPUs and is built to support increasing inference demands.

    At the OCP Global Summit in Germany, some 20-plus industry partners are showcasing new silicon, components, power systems and support for 800-volt direct current (VDC) data centers of the gigawatt era that will support the Nvidia Kyber rack architecture.


    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → HomeFi

    FTC Disclosure: This post contains affiliate links. We may earn a commission if you purchase through these links at no additional cost to you. See our Affiliate Disclosure for details.

  • BlackRock’s $40B data center deal opens a new infrastructure battle for CIOs

    BlackRock’s $40B data center deal opens a new infrastructure battle for CIOs

    A consortium led by private equity firm BlackRock is buying Aligned Data Centers for $40 billion. It’s said to be the largest data center deal in history — but more than that, it highlights a market shift that’s putting enterprise CIOs at a strategic disadvantage when it comes to accessing AI infrastructure.

    As private equity and tech giants consolidate ownership of data center capacity, they’re not removing it from the market, but they are fundamentally changing who gets first access, for how much, and on what terms. For enterprise IT leaders, that means competing for capacity after hyperscalers have locked up what they need, often years before it’s even built.

    “Capital has become the gatekeeper of compute, deciding who gets capacity, where, and at what price,” said Sanchit Vir Gogia, chief analyst and CEO at Greyhound Research. “When ownership changes hands, contracts and pricing often change with it.”

    The consortium acquiring Aligned consists of BlackRock’s Global Infrastructure Partners (GIP), the United Arab Emirates investment fund MGX, and the AI Infrastructure Partnership (AIP), which brings together BlackRock’s GIP and MGX (again), Microsoft, and Nvidia, among other investors. It will control more than 5 gigawatts of data center capacity across 50 campuses in the US and Latin America. The transaction is expected to close in the first half of 2026, subject to regulatory approvals, the consortium said in a statement.

    Control equals pricing power

    Private equity firms have accounted for 80-90% of total data center merger and acquisition activity since 2022, according to a report by Americans for Financial Reform. Transaction values reached $73 billion in 2024, up from $26 billion in 2023, according to Synergy Research Group.

    That consolidation has concentrated capacity in fewer hands, reducing competition and giving operators greater pricing power.

    Everest Group partner Yugal Joshi said, “CIOs are under significant pressure to clearly define their data center strategy beyond traditional one-off leases. Given most of the capacity is built and delivered by fewer players, CIOs need to prepare for a higher-price market with limited negotiation power.”

    The numbers bear this out. Global data center costs rose to $217.30 per kilowatt per month in the first quarter of 2025, with major markets seeing increases of 17-18% year-over-year, according to CBRE. Those prices are at levels last seen in 2011-2012, and analysts expect them to remain elevated.
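
    For a rough sense of what those rates imply for an enterprise budget, the sketch below multiplies a hypothetical capacity requirement by the quoted $217.30 per kilowatt per month and compounds an assumed annual increase. Only the rate and the 17-18% range come from the reporting above; the capacity, term, and escalation rate are illustrative assumptions.

    # Rough colocation budget model. Only the $217.30/kW-month rate and the
    # 17-18% year-over-year range come from the article (CBRE, Q1 2025);
    # the capacity, term, and escalation rate chosen here are assumptions.

    RATE_PER_KW_MONTH = 217.30   # USD, global average per CBRE
    CAPACITY_KW = 2_000          # hypothetical enterprise AI footprint (~2 MW)
    ANNUAL_ESCALATION = 0.17     # low end of the reported 17-18% increases
    YEARS = 3                    # assumed planning horizon

    total = 0.0
    rate = RATE_PER_KW_MONTH
    for year in range(1, YEARS + 1):
        annual_cost = rate * CAPACITY_KW * 12
        total += annual_cost
        print(f"Year {year}: ${annual_cost:,.0f} at ${rate:,.2f}/kW-month")
        rate *= 1 + ANNUAL_ESCALATION

    print(f"Total over {YEARS} years under these assumptions: ${total:,.0f}")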

    Gogia said, “The combination of AI demand, energy scarcity, and environmental regulation has permanently rewritten the economics of running workloads. Prices that once looked extraordinary have now become baseline.”

    Hyperscalers get first dibs

    The consolidation problem is compounded by the way capacity is being allocated. North America’s data center vacancy rate fell to 1.6% in the first half of 2025, with Northern Virginia posting just 0.76%, according to CBRE Research. More troubling for enterprises: 74.3% of capacity currently under construction is already preleased, primarily to cloud and AI providers.

    “The global compute market is no longer governed by open supply and demand,” Gogia said. “It is increasingly shaped by pre-emptive control. Hyperscalers and AI majors are reserving capacity years in advance, often before the first trench for power is dug. This has quietly created a two-tier world: one in which large players guarantee their future and everyone else competes for what remains.”

    That dynamic forces enterprises into longer planning cycles. “CIOs must forecast their infrastructure requirements with the same precision they apply to financial budgets and talent pipelines,” Gogia said. “The planning horizon must stretch to three or even five years.”

    The situation is further complicated by operators rebranding traditional facilities as AI-ready without fundamental infrastructure changes. “Many are rechristening traditional data centers into AI data centers to exploit the rapidly growing demand,” Joshi said. “This is further constraining the industry.”

    New strategies for a constrained market

    Analysts said enterprise IT leaders need to adopt new strategies to navigate the constrained market. Joshi recommends expanding beyond tier-1 data centers to secondary markets and working to secure capacity commitments with service-level agreements around availability, right of first offer, and right of first refusal. “Working with more data center and cloud vendors will help them diversify the risk to an extent,” he said.

    Power availability has emerged as a critical factor too. AI workloads require rack densities of 130 kilowatts currently, with projections reaching 250 kilowatts, according to JLL—far beyond the 40 kilowatts of traditional computing.

    CIOs should also examine existing workloads, Joshi said. “A significant amount of data center capacity is wasted on idle workloads because of poor architecture, underutilized adoption, and suboptimal management. If CIOs can modernize these workloads it can materially reduce the need for raw capacity that their data centers are now unable to meet.”

    Data center decision-making needs to happen at the top, said Gogia. “Infrastructure can no longer sit at the periphery of AI planning; it must sit at the centre of boardroom strategy,” he said. “The enterprises that control their compute environment will shape their own AI destiny. Those that rely on residual access will find that intelligence, like power, always flows to where it is guaranteed first.”

    This article originally appeared on CIO.com.


    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → roboform

    FTC Disclosure: This post contains affiliate links. We may earn a commission if you purchase through these links at no additional cost to you. See our Affiliate Disclosure for details.

  • Meta details cutting-edge networking technologies for AI infrastructure

    Meta details cutting-edge networking technologies for AI infrastructure

    Meta shared details about its AI and networking advances at this week’s 2025 OCP Global Summit in San Jose, Calif.

    Facebook was a founding member of Open Compute Project (OCP) in 2011, and its now-parent-company Meta has used the annual conference to showcase systems it’s developing to stay on the bleeding edge of technology. At this week’s event, that meant detailing Meta’s AI and networking advances.

    “The advent of AI has changed all our assumptions on how to scale our infrastructure. Building infrastructure for AI requires innovation at every layer of the stack, from hardware and software, to our networks, to our data centers themselves,” wrote Yee Jiun Song, vice president, and Kaushik Veeraraghavan, software engineer, Infra Foundation for Meta, in a blog post about Meta’s AI networking efforts. 

    One of Meta’s themes over the years has been to support open systems development, and it’s continuing that effort. 

    “We have a long way to go in continuing to push open standards. We need standardization of systems, racks and power as rack power density continues to increase. We need standardization of the scale up and scale out network that these AI clusters use so that customers can mix/match different GPUs and accelerators to always use the latest and more cost-effective hardware,” Song and Veeraraghavan wrote. 

    “We need software innovation and standards to allow us to run jobs across heterogeneous hardware types that may be spread in different geographic locations. These open standards need to exist all the way through the stack, and there are massive opportunities to eliminate friction that is slowing down the build out of AI infrastructure.”

    ESUN initiative

    As part of its standardization efforts, Meta said it would be a key player in the new Ethernet for Scale-Up Networking (ESUN) initiative that brings together AMD, Arista, ARM, Broadcom, Cisco, HPE Networking, Marvell, Microsoft, NVIDIA, OpenAI and Oracle to advance the networking technology to handle the growing scale-up domain for AI systems.

    ESUN will focus solely on open, standards-based Ethernet switching and framing for scale-up networking—excluding host-side stacks, non-Ethernet protocols, application-layer solutions, and proprietary technologies. The group will focus on the development and interoperability of XPU network interfaces and Ethernet switch ASICs for scale-up networks, the OCP wrote in a blog.

    ESUN will actively engage with other organizations such as Ultra-Ethernet Consortium (UEC) and long-standing IEEE 802.3 Ethernet to align open standards, incorporate best practices, and accelerate innovation, the OCP stated.

    Data center networking milestones

    The launch of ESUN is just one of the AI networking developments Meta shared at the event. Meta engineers also announced three data center networking innovations aimed at making its infrastructure more flexible, scalable, and efficient:

    • The evolution of Meta’s Disaggregated Scheduled Fabric (DSF) to support scale-out interconnect for large AI clusters that span entire data center buildings.
    • A new Non-Scheduled Fabric (NSF) architecture based entirely on shallow-buffer, disaggregated Ethernet switches that will support Meta’s largest AI clusters, such as Prometheus.
    • The addition of Minipack3N, based on Nvidia’s Ethernet Spectrum-4 ASIC, to Meta’s portfolio of 51Tbps OCP switches that use OCP’s Switch Abstraction Interface and Meta’s Facebook Open Switching System (FBOSS) software stack.

    DSF is Meta’s open networking fabric that completely separates switch hardware, NICs, endpoints, and other networking components from the underlying network, using OCP-SAI and FBOSS to achieve that, according to Meta. It supports Ethernet-based RDMA over Converged Ethernet (RoCE) to endpoints, accelerators and NICs from multiple vendors, such as Nvidia, AMD and Broadcom, including Meta’s own MTIA accelerator stack. It then uses scheduled-fabric techniques between endpoints, particularly Virtual Output Queuing, to schedule traffic and proactively avoid congestion rather than just reacting to it, according to Meta.

    “Over the last year, we have evolved DSF to a 2-stage architecture, scaling to support a non-blocking fabric that interconnects up to 18,432 XPUs,” wrote a group of Meta engineers in a co-authored blog post about the new advances. “These clusters are a fundamental building block for constructing AI clusters that span regions (and even multiple regions) in order to meet the increased capacity and performance demands of Meta’s AI workloads.”
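
    As a rough way to see how a two-stage fabric reaches that scale, the sketch below applies the standard capacity formula for a non-blocking folded-Clos design (endpoints = downlinks per leaf × number of leaves). The 192-port switch radix is an assumption chosen because it happens to reproduce the 18,432 figure; Meta has not published the radix here, and DSF’s scheduled-fabric design differs from a plain Clos in how traffic is handled.

    # Generic capacity math for a two-stage, non-blocking folded-Clos fabric.
    # Assumption: identical switches with PORTS ports each, with leaves splitting
    # ports 50/50 between endpoint-facing downlinks and spine-facing uplinks.
    # The 192-port radix is an illustrative choice, not a confirmed hardware detail.

    PORTS = 192

    def two_stage_nonblocking_endpoints(ports_per_switch: int) -> int:
        downlinks_per_leaf = ports_per_switch // 2   # half the ports face endpoints
        max_leaves = ports_per_switch                # each spine port reaches one leaf
        return downlinks_per_leaf * max_leaves

    print(two_stage_nonblocking_endpoints(PORTS))    # -> 18432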

    Alongside DSF, Meta has added a new architecture called the Non-Scheduled Fabric (NSF), which is based on shallow-buffer OCP Ethernet switches to deliver low round-trip latency, the engineers wrote.

    NSF architecture is a three-tier fabric that supports adaptive routing for effective load-balancing. This helps minimize congestion and ensure optimal utilization of GPUs, which is critical for maximizing performance in Meta’s largest AI factories, according to Meta: “NSF supports adaptive routing for effective load-balancing, ensuring optimal utilization and minimizing congestion and serves as foundational building block for Gigawatt-scale AI clusters such as Meta’s Gigawatt-scale AI cluster, Prometheus.”
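
    Adaptive routing here means the fabric steers traffic across equal-cost paths based on live congestion rather than a static hash. The snippet below is a generic illustration of that idea (least-loaded path selection); it is not Meta’s NSF implementation, and the link names and utilization numbers are made up.

    # Generic adaptive-routing idea: among equal-cost uplinks, send the next
    # flowlet to the least-loaded one instead of hashing statically.
    # Illustrative only; link names and loads are hypothetical.

    link_load = {"spine-1": 0.62, "spine-2": 0.18, "spine-3": 0.44}

    def static_ecmp(flow_id: int) -> str:
        """Congestion-oblivious: a flow always hashes to the same uplink."""
        return sorted(link_load)[flow_id % len(link_load)]

    def adaptive_pick() -> str:
        """Congestion-aware: choose the currently least-loaded uplink."""
        return min(link_load, key=link_load.get)

    print(static_ecmp(3))    # -> spine-1, regardless of its 62% load
    print(adaptive_pick())   # -> spine-2, the least-loaded path right now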

    Going forward, Meta will utilize both DSF and NSF depending on its needs. DSF will provide a high-efficiency, highly scalable network for large, but still modular, AI clusters, while NSF will be targeted at the extreme demands of its largest, gigawatt-scale AI factories such as Prometheus, where low latency and robust adaptive routing are paramount.

    Meta targeted the optical networking world as well. Last year, it introduced 2x400G FR4 BASE (3-km) optics, the primary solution supporting next-generation 51T platforms across both backend and frontend networks and DSFs. These optics have now been widely deployed throughout Meta’s data centers, the engineers stated: 

    “This year, we are expanding our portfolio with the launch of 2x400G FR4 LITE (500-m) optics. FR4 LITE is optimized for the majority of intra–data center use cases, supporting fiber links up to 500 meters. This new variant is designed to accelerate optics cost reduction while maintaining robust performance for shorter-reach applications.”

    In addition, Meta added the 400G DR4 OSFP-RHS optics — its first-generation DR4 package for AI host-side NIC connectivity. Complementing this, the new 2x400G DR4 OSFP optics are being deployed on the switch side, providing connectivity from host to switch.


    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → Surfshark

    FTC Disclosure: This post contains affiliate links. We may earn a commission if you purchase through these links at no additional cost to you. See our Affiliate Disclosure for details.

  • Arm joins Open Compute Project to build next-generation AI data center silicon

    Arm joins Open Compute Project to build next-generation AI data center silicon

    Chip designer Arm Holdings plc has announced it is joining the Open Compute Project to help address rising energy demands from AI-oriented data centers.

    Arm said it plans to support companies in developing the next phase of purpose-built silicon and packaging for converged infrastructure. The company said that to build this next phase of infrastructure requires co-designed capabilities across compute, acceleration, memory, storage, networking and beyond.

    The new converged AI data centers won’t be built like those that came before, with separate CPU, GPU, networking and memory components. They will feature increased density through purpose-built, in-package integration of multiple chiplets using 2.5D and 3D technologies, according to Arm.

    Arm is addressing this by contributing the Foundation Chiplet System Architecture (FCSA) specification to the Open Compute Project. FCSA leverages Arm’s ongoing work with the Arm Chiplet System Architecture (CSA) but addresses industry demand for a framework that aligns to vendor- and CPU architecture-neutral requirements.

    Alongside the FCSA contribution, Arm is broadening its Arm Total Design ecosystem to help power the next generation of converged data centers.

    The benefits for OEM partners are power efficiency and custom design of the processors, said Mohamed Awad, senior vice president and general manager of infrastructure business at Arm. “For anybody building a data center, the specific challenge that they’re running into is not really about the dollars associated with building, it’s about keeping up with the [power] demand,” he said.

    Keeping up with the demand comes down to performance, and more specifically, performance per watt. With power limited, OEMs have become much more involved in all aspects of the system design, rather than pulling silicon off the shelf or pulling servers or racks off the shelf.

    “They’re getting much more specific about what that silicon looks like, which is a big departure from where the data center was ten or 15 years ago. The point here being is that they look to create a more optimized system design to bring the acceleration closer to the compute, and get much better performance per watt,” said Awad.

    The Open Compute Project is a global industry organization dedicated to designing and sharing open-source hardware configurations for data center technologies and infrastructure. It covers everything from silicon products to rack and tray design.  It is hosting its 2025 OCP Global Summit this week in San Jose, Calif.

    Arm also was part of the Ethernet for Scale-Up Networking (ESUN) initiative announced this week at the Summit that included AMD, Arista, Broadcom, Cisco, HPE Networking, Marvell, Meta, Microsoft, and Nvidia. ESUN promises to advance Ethernet networking technology to handle scale-up connectivity across accelerated AI infrastructures.

    Arm’s goal in joining OCP is to encourage knowledge sharing and collaboration, with companies and users exchanging ideas, specifications and intellectual property. Arm is known for focusing on modular rather than monolithic designs, which is where chiplets come in.

    For example, customers might have any of several companies build a 64-core CPU and then choose the I/O to pair it with, such as PCIe or NVLink. They then choose their own memory subsystem, deciding whether to go with HBM, LPDDR, or DDR. It’s all mix and match, like Legos, Awad said.

    “What this model allows for is the sort of selection of those components and differentiation where it makes sense, without having to redo all the other aspects of the system, which are effectively common across multiple different designs,” said Awad.
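
    A toy illustration of that “Lego” composition model in code: the component names, the shared die-to-die interface label, and the compatibility check below are invented for illustration and are not part of Arm’s CSA or FCSA specifications.

    # Toy model of chiplet "mix and match" composition. All names and values
    # here are invented for illustration; Arm's CSA/FCSA define the real framework.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Chiplet:
        name: str
        kind: str        # "compute", "io", or "memory"
        interface: str   # die-to-die interface the whole package must agree on

    def assemble_package(*chiplets: Chiplet) -> str:
        interfaces = {c.interface for c in chiplets}
        if len(interfaces) != 1:
            raise ValueError(f"incompatible die-to-die interfaces: {interfaces}")
        return " + ".join(c.name for c in chiplets)

    # Hypothetical catalog: swap any one part without redoing the others.
    cpu = Chiplet("64-core CPU die (vendor A)", "compute", "d2d-v1")
    io  = Chiplet("PCIe I/O die", "io", "d2d-v1")
    mem = Chiplet("LPDDR memory subsystem", "memory", "d2d-v1")

    print(assemble_package(cpu, io, mem))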


    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → Ecovacs

    FTC Disclosure: This post contains affiliate links. We may earn a commission if you purchase through these links at no additional cost to you. See our Affiliate Disclosure for details.

  • The business case for microsegmentation: Lower insurance costs, 33% faster ransomware response

    The business case for microsegmentation: Lower insurance costs, 33% faster ransomware response

    Network segmentation has been a security best practice for decades, yet for a variety of reasons not all network deployments have fully embraced microsegmentation. With ransomware attacks becoming increasingly sophisticated and cyber insurance underwriters paying closer attention to network architecture, microsegmentation is shifting from nice-to-have to business imperative.

    New research from Akamai examines how organizations are approaching microsegmentation adoption, implementation challenges, and the tangible benefits they’re seeing. The data reveals a significant gap between awareness and execution, but it also shows clear financial and operational incentives for network teams willing to make the transition. Key findings from Akamai’s Segmentation Impact Study, which surveyed 1,200 security and technology leaders worldwide, include:

    • Only 35% of organizations have implemented microsegmentation across their network environment despite 90% having adopted some form of segmentation.
    • Organizations with more than $1 billion in revenue saw ransomware containment time reduced by 33% after implementing microsegmentation.
    • 60% of surveyed organizations received lower insurance premiums tied to segmentation maturity.
    • 75% of insurers now assess segmentation posture during underwriting.
    • Network complexity (44%), visibility gaps (39%) and operational resistance (32%) remain the primary barriers to adoption.
    • Half of non-adopters plan to implement microsegmentation within two years, while 68% of current users expect to increase investment.

    “I believe the biggest surprise in the data was the effectiveness of microsegmentation when used as a tool for containing breaches,” Garrett Weber, field CTO for enterprise security at Akamai, told Network World. “We often think of segmentation as a set-it-and-forget-it solution, but with microsegmentation bringing the control points to the workloads themselves, it offers organizations the ability to quickly contain breaches.”

    Why traditional segmentation falls short

    Microsegmentation applies security policies at the individual workload or application level rather than at the network perimeter or between large network zones.
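
    As a vendor-neutral illustration of what “policy at the workload level” means, the sketch below models a default-deny allow list evaluated per workload pair rather than per subnet; the workload labels, ports, and rules are hypothetical and are not drawn from Akamai’s product.

    # Minimal default-deny, workload-level policy model (illustrative only).
    # Labels, flows, and ports are hypothetical examples, not Akamai specifics.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Rule:
        src_label: str   # workload label, not a subnet or VLAN
        dst_label: str
        port: int

    ALLOW = [
        Rule("web-frontend", "orders-api", 8443),
        Rule("orders-api", "orders-db", 5432),
    ]

    def is_allowed(src_label: str, dst_label: str, port: int) -> bool:
        """Default deny: traffic passes only if an explicit workload-to-workload rule matches."""
        return Rule(src_label, dst_label, port) in ALLOW

    print(is_allowed("web-frontend", "orders-api", 8443))  # True: explicitly allowed
    print(is_allowed("web-frontend", "orders-db", 5432))   # False: no lateral path to the DB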

    Weber challenged network admins who feel their current north-south segmentation is adequate. “I would challenge them to really try and assess and understand the attacker’s ability to move laterally within the segments they’ve created,” he said. “Without question they will find a path from a vulnerable web server, IoT device or endpoint that can allow an attacker to move laterally and access sensitive information within the environment.”

    The data supports this assessment. Organizations implementing microsegmentation reported multiple benefits beyond ransomware containment. These include protecting critical assets (74%), responding faster to incidents (56%) and safeguarding against internal threats (57%).

    Myths and misconceptions about microsegmentation

    The report detailed a number of reasons why organizations have not properly deployed microsegmentation. Network complexity topped the list of implementation barriers at 44%, but Weber questioned the legitimacy of that barrier.

    “Many organizations believe their network is too complex for microsegmentation, but once we dive into their infrastructure and how applications are developed and deployed, we typically see that microsegmentation solutions are a better fit for complex networks than traditional segmentation approaches,” Weber said. “There is usually a misconception that microsegmentation solutions are reliant on a virtualization platform or cannot support a variety of cloud or kubernetes deployments, but modern microsegmentation solutions are built for simplifying network segmentation within complex environments.”

    Another common misconception is that implementing microsegmentation solutions will impact performance of applications and potentially create outages from poor policy creation. “Modern microsegmentation solutions are designed to minimize performance impacts and provide the proper workflows and user experiences to safely implement security policies at scale,” Weber said.

    Insurance benefits create business case

    Cyber insurance has emerged as an unexpected driver for microsegmentation adoption. The report states that 85% of organizations using microsegmentation find audit reporting easier. Of those, 33% reported reduced costs associated with attestation and assurance. More significantly, 74% believe stronger segmentation increases the likelihood of insurance claim approval.

    For network teams struggling to justify the investment to leadership, the insurance angle can provide concrete financial benefits: 60% of surveyed organizations said they received premium reductions as a result of improved segmentation posture.

    Beyond insurance savings and faster ransomware response, Weber recommends network admins track several operational performance indicators to demonstrate ongoing value.

    Attack surface reduction of critical applications or environments can provide a clear security posture metric. Teams should also monitor commonly abused ports and services like SSH and Remote Desktop. The goal is tracking how much of that traffic is being analyzed and controlled by policy.

    For organizations integrating microsegmentation into SOC playbooks, time to breach identification and containment can offer a direct measure of incident response improvement.

    AI can help ease adoption

    Since it’s 2025, no conversation about any technology can be complete without mention of AI. For its part, Akamai is investing in AI to help improve the user experience with microsegmentation. 

    Weber outlined three specific areas where AI is improving the microsegmentation experience. First, AI can automatically identify and tag workloads. It does this by analyzing traffic patterns, running processes and other data points. This eliminates manual classification work.

    Second, AI assists in recommending security policies faster and with more granularity than most network admins and application owners can achieve manually. This capability is helping organizations implement policies at scale.

    Third, natural language processing through AI assistants helps users mine and understand the significant amount of data microsegmentation solutions collect. This works regardless of their experience level with the platform.

    Implementation guidance

    According to the survey, 50% of non-adopters plan to implement microsegmentation within the next 24 months. For those looking to implement microsegmentation effectively, the report outlines four key steps:

    • Achieve deep, continuous visibility: Map workloads, applications and traffic patterns in real time to surface dependencies and risks before designing policies
    • Design policies at the workload level: Apply fine-grained controls that limit lateral movement and enforce zero-trust principles across hybrid and cloud environments
    • Simplify deployment with scalable architecture: Adopt solutions that embed segmentation into existing infrastructure without requiring a full network redesign
    • Strengthen governance and automation: Align segmentation with security operations and compliance goals, using automation to sustain enforcement and accelerate maturity

    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → EHarmony

    FTC Disclosure: This post contains affiliate links. We may earn a commission if you purchase through these links at no additional cost to you. See our Affiliate Disclosure for details.

  • Electrifying Everything Will Require Multiphysics Modeling

    Electrifying Everything Will Require Multiphysics Modeling

    A prototyping problem is emerging in today’s efforts to electrify everything. What works as a lab-bench mockup breaks in reality. Harnessing and safely storing energy at grid scale and in cars, trucks, and planes is a very hard problem that simplified models sometimes can’t touch.

    “In electrification, at its core, you have this combination of electromagnetic effects, heat transfer, and structural mechanics in a complicated interplay,” says Bjorn Sjodin, senior vice president of product management at the Stockholm-based software company COMSOL.

    COMSOL is an engineering R&D software company that seeks to simulate not just a single phenomenon—for instance, the electromagnetic behavior of a circuit—but rather all the pertinent physics that needs to be simulated for developing new technologies in real-world operating conditions.

    Engineers and developers gathered in Burlington, Mass., on 8-10 October for COMSOL’s annual Boston conference, where they discussed engineering simulations that couple multiple physics packages simultaneously. Multiphysics modeling, as the emerging field is called, has become a component of electrification R&D that is more than just nice-to-have.

    “Sometimes, I think some people still see simulation as a fancy R&D thing,” says Niloofar Kamyab, a chemical engineer and applications manager at COMSOL. “Because they see it as a replacement for experiments. But no, experiments still need to be done, though experiments can be done in a more optimized and effective way.”

    Can Multiphysics Scale Electrification?

    Multiphysics, Kamyab says, can sometimes be only half the game.

    “I think when it comes to batteries, there is another attraction when it comes to simulation,” she says. “It’s multi-scale—how batteries can be studied across different scales. You can get in-depth analysis that, if not very hard, I would say is impossible to do experimentally.”

    In part, this is because batteries reveal complicated behaviors (and runaway reactions) at the cell level, and in unpredictable new ways at the battery-pack level.

    “Most of the people who do simulations of battery packs, thermal management is one of their primary concerns,” Kamyab says. “You do this simulation so you know how to avoid it. You recreate a cell that is malfunctioning.” She adds that multiphysics simulation of thermal runaway enables battery engineers to safely test how each design behaves in even the most extreme conditions—in order to stop any battery problems or fires before they could happen.

    Wireless charging systems are another area of electrification, with their own thermal challenges. “At higher power levels, localized heating of the coil changes its conductivity,” says Nirmal Paudel, a lead engineer at Veryst Engineering, an engineering consulting firm based in Needham, Mass. And that, he notes, in turn can change the entire circuit as well as the design and performance of all the elements that surround it.
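
    The coil example is a compact case for coupled simulation: heating raises the coil’s resistance, which changes the losses, which changes the heating. The sketch below iterates that feedback to a self-consistent operating point; copper’s temperature coefficient is a textbook value, while the current, cold resistance, and thermal resistance are made-up inputs, not figures from Veryst or COMSOL.

    # Minimal coupled electro-thermal loop: resistance depends on temperature,
    # and temperature depends on dissipated power. Copper's ~0.0039/K coefficient
    # is a textbook value; the other inputs below are illustrative, not measured.

    I_RMS = 10.0          # coil current, A (assumed)
    R_20C = 0.05          # coil resistance at 20 degC, ohms (assumed)
    ALPHA_CU = 0.0039     # copper temperature coefficient of resistance, 1/K
    R_THERMAL = 2.0       # coil-to-ambient thermal resistance, K/W (assumed)
    T_AMBIENT = 20.0      # degC

    temp = T_AMBIENT
    for _ in range(50):   # fixed-point iteration until the two physics agree
        resistance = R_20C * (1 + ALPHA_CU * (temp - 20.0))
        power_loss = I_RMS ** 2 * resistance
        temp = T_AMBIENT + R_THERMAL * power_loss

    print(f"steady state: {temp:.1f} degC, {power_loss:.2f} W dissipated, "
          f"{resistance * 1000:.1f} mOhm vs {R_20C * 1000:.1f} mOhm cold")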

    Electric motors and power converters require similar simulation savvy. According to electrical engineer and COMSOL senior application engineer Vignesh Gurusamy, older ways of developing these age-old electrical workhorse technologies are proving less useful today. “The recent surge in electrification across diverse applications demands a more holistic approach as it enables the development of new optimal designs,” Gurusamy says.

    And freight transportation: “For trucks, people are investigating, Should we use batteries? Should we use fuel cells?” Sjodin says. “Fuel cells are very multiphysics friendly—fluid flow, heat transfer, chemical reactions, and electrochemical reactions.”

    Lastly, there’s the electric grid itself. “The grid is designed for a continuous supply of power,” Sjodin says. “So when you have power sources [like wind and solar] shutting off and on all the time, you have completely new problems.”

    Multiphysics in Battery and Electric Motor Design

    Taking such an all-in approach to engineering simulations can yield unanticipated upsides as well, says Kamyab. Berlin-based automotive engineering company IAV, for example, is developing powertrain systems that integrate multiple battery formats and chemistries in a single pack. “Sodium ion cannot give you the energy that lithium ion can give,” Kamyab says. “So they came up with a blend of chemistries, to get the benefits of each, and then designed a thermal management that matches all the chemistries.”

    Jakob Hilgert, who works as a technical consultant at IAV, recently contributed to a COMSOL industry case study. In it, Hilgert described the design of a dual-chemistry battery pack that combines sodium-ion cells with a more costly lithium solid-state battery.

    Hilgert says that using multiphysics simulation enabled the IAV team to play the two chemistries’ different properties off of each other. “If we have some cells that can operate at high temperatures and some cells that can operate at low temperatures, it is beneficial to take the exhaust heat of the higher-running cells to heat up the lower-running cells, and vice versa,” Hilgert said. “That’s why we came up with a cooling system that shifts the energy from cells that want to be in a cooler state to cells that want to be in a hotter state.”

    According to Sjodin, IAV is part of a larger trend in a range of industries that are impacted by the electrification of everything. “Algorithmic improvements and hardware improvements multiply together,” he says. “That’s the future of multiphysics simulation. It will allow you to simulate larger and larger, more realistic systems.”

    According to Gurusamy, GPU accelerators and surrogate models allow for bigger jumps in electric motor capabilities and efficiencies. Even seemingly simple components like the windings of copper wire in a motor core (called stators) provide parameters that multiphysics can optimize.

    “A primary frontier in electric motor development is pushing power density and efficiency to new heights, with thermal management emerging as a key challenge,” Gurusamy says. “Multiphysics models that couple electromagnetic and thermal simulations incorporate temperature-dependent behavior in stator windings and magnetic materials.”

    Simulation is also changing the wireless charging world, Paudel says. “Traditional design cycles tweak coil geometry,” he says. “Today, integrated multiphysics platforms enable exploration of new charging architectures,” including flexible charging textiles and smart surfaces that adapt in real-time.

    And batteries, according to Kamyab, are continuing a push toward higher power densities and lower price points. That push is changing not just the industries where batteries are already used, like consumer electronics and EVs; higher-capacity batteries are also driving new industries like electric vertical take-off and landing (eVTOL) aircraft.

    “The reason that many ideas that we had 30 years ago are becoming a reality is now we have the batteries to power them,” Kamyab says. “That was the bottleneck for many years. … And as we continue to push battery technology forward, who knows what new technologies and applications we’re making possible next.”


    🛸 Recommended Intelligence Resource

    As UAP researchers and tech enthusiasts, we’re always seeking tools and resources to enhance our investigations and stay ahead of emerging technologies. Check out this resource that fellow researchers have found valuable.

    → Ecovacs