This tech job has just been crowned the best in the country

Tech and STEM-related jobs offer the best opportunity for job hunters and career-switchers to find satisfaction in their work, according to new careers data from Glassdoor.

Glassdoor’s 25 Best Jobs List for 2022 found that software developers, data scientists, IT architects and full-stack engineers dominate the top spots when measured by salary, job satisfaction and the number of job openings.

The salary and satisfaction ratings were drawn from “hundreds of thousands” of employee reviews on the Glassdoor platform, then combined with the number of vacancies for each role to create an overall ‘job score’.
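
As a rough illustration of how such a composite score could be computed, here is a minimal Python sketch that scales each of the three factors onto a five-point range and averages them. The function name, weighting and scaling caps are assumptions for illustration only, not Glassdoor’s published methodology.

```python
# Toy composite "job score": the equal weighting and the caps below are
# illustrative assumptions, not Glassdoor's actual formula.

def job_score(median_salary, satisfaction, openings,
              salary_cap=250_000, openings_cap=100_000):
    """Average three factors after scaling each onto a 0-5 range."""
    salary_score = min(median_salary / salary_cap, 1.0) * 5
    satisfaction_score = satisfaction                # already rated out of 5
    openings_score = min(openings / openings_cap, 1.0) * 5
    return round((salary_score + satisfaction_score + openings_score) / 3, 2)

# Hypothetical role: $120k median salary, 4.3/5 satisfaction, 10,000 openings.
print(job_score(120_000, 4.3, 10_000))  # -> 2.4
```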

Source: This tech job has just been crowned the best in the country | ZDNet

Ask Watson or Siri: Artificial intelligence is as elusive as ever

In 1966, some MIT researchers reckoned that they could develop computer vision as a summer project, perhaps even get a few smart undergrads to complete the task.

The world has been working on the problem ever since.

Computer vision is the task of getting computers to recognize objects the way people do. That’s a tree. He’s Carlos. And so on. It’s one of a number of tasks we consider essential for generalized artificial intelligence, in which machines can act and reason as humans do.
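
To make the task concrete, here is a minimal sketch of present-day object recognition using a pretrained ImageNet classifier from the torchvision library. The file name "photo.jpg" is a placeholder, and this is just one illustrative approach, not how any of the systems mentioned here actually work.

```python
# Sketch of object recognition with a model pretrained on ImageNet.
# Assumes torch, torchvision and Pillow are installed; "photo.jpg" is a
# placeholder image path.
import torch
from PIL import Image
from torchvision import models

# Load a ResNet-50 pretrained on ImageNet (1,000 everyday object categories).
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

# Preprocess the image exactly as the model expects.
image = Image.open("photo.jpg").convert("RGB")
batch = weights.transforms()(image).unsqueeze(0)

# Predict and print the most likely label (a tree species, a dog breed, etc.).
with torch.no_grad():
    probs = model(batch).softmax(dim=1)
confidence, class_idx = probs.max(dim=1)
print(weights.meta["categories"][class_idx.item()], round(float(confidence), 3))
```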

While we’ve made considerable headway in computer vision, especially in recent years, the fact that it has taken some 50 years longer than expected shows why artificial intelligence (AI) is such a difficult and elusive goal.

via Ask Watson or Siri: Artificial intelligence is as elusive as ever | Computerworld.

Google’s big-data tool, Mesa, holds petabytes of data across multiple servers

Google says its big-data architecture, Mesa, can store petabytes of data, update millions of rows of data per second, and field trillions of queries daily across multiple servers, enabling continuous operation of the data warehouse even if a data center fails. “Mesa ingests data generated by upstream services, aggregates and persists the data internally, and serves the data via user queries,” note Google researchers. They say Mesa was originally constructed to house and analyze critical measurement data for Google’s Internet advertising business, but the technology could be applicable to other, similar data warehouse tasks. Mesa is dependent on other Google technologies, such as the Colossus distributed file system, the BigTable distributed data storage system, and the MapReduce data analysis framework. Google engineers implemented Paxos, a distributed synchronization protocol, to help address query consistency issues. Mesa also can operate on generic servers, making costly specialized hardware unnecessary and enabling Mesa to be run as a cloud service with the advantage of scalability.
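
As a loose illustration of the ingest-aggregate-serve pattern described above (and emphatically not Mesa’s actual design or API), here is a toy in-memory Python sketch in which updates arrive in batches, are aggregated by key, and every query reports the batch version it reflects, so readers see a consistent state.

```python
# Toy in-memory sketch of ingest / aggregate / serve with versioned batches.
# Class and method names are illustrative assumptions, not Mesa's interface.
from collections import defaultdict

class ToyWarehouse:
    def __init__(self):
        self.totals = defaultdict(int)   # aggregated metric per key
        self.version = 0                 # last batch applied

    def ingest(self, batch):
        """Apply one batch of (key, delta) rows, then advance the version."""
        for key, delta in batch:
            self.totals[key] += delta
        self.version += 1

    def query(self, key):
        """Serve the aggregated value plus the batch version it reflects."""
        return self.totals[key], self.version

warehouse = ToyWarehouse()
warehouse.ingest([("ad-42", 3), ("ad-7", 1), ("ad-42", 2)])
print(warehouse.query("ad-42"))   # (5, 1): total for ad-42 as of batch 1
```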

Google’s big-data tool, Mesa, holds petabytes of data across multiple servers – Computerworld.

Career alert: A Master of analytics degree is the ticket

The toughest part about earning a Master of Science in Analytics at North Carolina State University (NCSU) may be deciding which job to accept. The 75 students in the class of 2014, which is nearing graduation, received a total of 246 job offers from 55 employers. Added together, the starting salaries and bonuses offered to grads of the university’s Institute for Advanced Analytics reached $22.5 million, 24% higher than last year’s combined offers. This is an analytics program, after all; they keep track of these things. This meant that a lot of employers went home unhappy, unable to get the candidate they were after, despite offering nearly six-figure salaries on average — and bonuses as well. High demand from employers is also boosting applications, which in turn is reducing acceptance rates.

via Career alert: A Master of analytics degree is the ticket — if you can get into class – Computerworld.

DARPA looks to GPUs to help process big data in the military

The US defence agency is appealing to developers to repurpose its XDATA cloud for military decision-making

The Defense Advanced Research Projects Agency (DARPA) is looking to GPUs to help it master big data in support of governmental and military efforts.

Chris White, project manager at DARPA, told attendees at the GPU Technology Conference that the agency is looking for people to help it understand real-world battlefields using GPUs.

DARPA has more than a dozen data science projects on the go, but at present, only two of them use GPUs – one of which is the agency’s XDATA cloud, which develops ways to process and analyse large data sets.

via DARPA looks to GPUs to help process big data in the military | IT PRO.

Experts Explain Why Big Data is a Big Deal

Expert speakers participating in a recent seminar at the University of California, San Diego discussed the rapid growth of big data and how it is affecting people’s daily lives. California Institute for Telecommunications and Information Technology director Larry Smarr pointed to a typical Google search on a smartphone, whose operation requires more computing power than all of the Apollo space missions put together. “Never in our history have we had a sustained period of this kind of exponential growth [in computer science],” Smarr said. “What we’re talking about is something humanity has never tried to deal with before.” The key theme of the seminar was speculation on the future changes that big data will usher in. Fellow speaker and San Diego Supercomputer Center director Michael Norman discussed the center’s Gordon supercomputer, a repository that moves, houses, and analyzes data with vast volumes of flash-based memory. The research areas Gordon is used for include climatology, finance, food production, big industry, physics, biological science, and government. Norman says the three defining characteristics of big data are the volume of data, the speed at which it is produced, and the variety of data that is readily available.

Experts Explain Why Big Data is a Big Deal.