Minnesota companies and workers cash in on big data

Kossi Gavi drives to class on Sunday afternoons to learn retail software, and the reason is simple.

People who wield computers to analyze large amounts of digital information are in high demand, and Gavi is learning a program that chain stores worldwide use to run their businesses. Workers who know the program can earn up to $80,000 per year.

“It’s a very good program if you want more opportunities to make more money,” said the 43-year-old former refugee from Togo, now a few months from earning a bachelor’s degree at Metro State University.

Businesses today control massive and growing streams of information that flow from cash registers, patient records, smartphones, warehouses, the sensors in your Nikes, databases, Facebook and good old-fashioned loyalty cards.

The challenge is finding people who can put it all together and turn it into better strategy. Everyone from the Central Intelligence Agency to Gander Mountain is on the hunt.

“I would challenge you to describe to me an organization of any size in any industry or not-for-profit setting that will not be leveraging this,” said Isaac Cheifetz, a headhunter working to find the Mayo Clinic a head of information management and analytics. “Name one. I can’t.”

Businesses have the data to keep sale racks thin, streamline shipping and get more people to click ads. What they need is better analysts. It’s a new kind of job, and it’s coming to your workplace if it’s not already there.

The McKinsey Global Institute predicted in 2011 that a big data boom would create up to 190,000 new deep analytics positions in the United States, along with demand for 1.5 million data-savvy managers.

If you can run Hadoop — open-source software used by Google, Yahoo and Facebook to analyze the deluges of information churned out by the Internet — you might get a free flight to the Bay Area for a job interview, said Ravi Bapna, director of the University of Minnesota’s Social Media and Business Analytics Collaborative.

“The premium for these sort of people is already very high, and it will only increase over time,” Bapna said. “There is a huge shortage of people who can handle the data, who have the business acumen to be able to ask the right questions, to do the experiments and make the right inferences.”

Big data refers to a series of software and hardware advances; chief among them is a new ability to impose structure on vast pools of complex information — like pictures, consumer preferences, geographic locations and video of the ocean.

Traditional data is — to use one of Cheifetz’s analogies — like the zoo. Think Excel spreadsheets. Everything has a label and fits in a format where it can be easily sorted.

But the world is awash in the unstructured information of the Internet, mobile phones, social media. Instead of a zoo, this information spreads out like a nature preserve. It’s moving, wild, and can’t be captured in database cells.

A major achievement of big data has been its ability to sort the unstructured information of the preserve, and to do complex analysis of huge amounts of information in parallel on several machines.
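The split-the-work-then-merge pattern that Hadoop popularized can be sketched in miniature. The example below is a hypothetical illustration, not Hadoop itself: the function names and sample data are invented, and Python's multiprocessing stands in for a cluster of machines. Each "shard" of text is counted independently (the map step), then the partial counts are merged (the reduce step).

```python
# Toy map-reduce word count -- the pattern behind Hadoop-style analysis.
# Hypothetical sketch: real systems shard data across many machines;
# here a process pool stands in for the cluster.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk):
    """Map step: count the words in one shard of the data."""
    return Counter(chunk.split())

def reduce_counts(partials):
    """Reduce step: merge the per-shard counts into one total."""
    total = Counter()
    for partial in partials:
        total += partial
    return total

if __name__ == "__main__":
    shards = ["big data big jobs", "data jobs data skills"]
    with Pool(2) as pool:
        partials = pool.map(map_count, shards)  # shards counted in parallel
    print(reduce_counts(partials))
```

Because each shard is counted with no knowledge of the others, the map step scales out by simply adding machines; only the final merge needs to see everything.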

Software like Hadoop makes it possible, for instance, to analyze a photo, attach a digital signature that describes it, and compare that signature to the signatures of extremely large numbers of other photos.

Thus unstructured data gets structure, and analyzing huge amounts of information becomes practical. Similar approaches are used to analyze buying behavior, the types of ads people respond to, and even fraud.
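One way to picture the photo-signature idea is an "average hash": reduce each image to a short bit string, then compare bit strings instead of raw pixels. The sketch below is a hypothetical illustration; the tiny pixel grids and function names are invented, and real systems use far richer signatures, but it shows how unstructured pixels become comparable structure.

```python
# Minimal sketch of "attach a signature, then compare signatures".
# Hypothetical illustration: 2x2 grayscale "images" as pixel grids.

def average_hash(pixels):
    """Turn a grayscale pixel grid into a bit-string signature:
    1 where a pixel is brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(sig_a, sig_b):
    """Count differing bits; lower means more similar images."""
    return sum(a != b for a, b in zip(sig_a, sig_b))

photo = [[10, 200], [220, 30]]
near_duplicate = [[12, 198], [210, 35]]   # slightly different pixels
different = [[200, 10], [30, 220]]        # inverted light/dark layout

sig = average_hash(photo)
print(hamming(sig, average_hash(near_duplicate)))  # 0 -- same pattern
print(hamming(sig, average_hash(different)))       # 4 -- opposite pattern
```

Because every image collapses to a fixed-length signature, comparing one photo against millions becomes a cheap bit-level operation that parallelizes easily.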

Retail companies like Best Buy and Target are keenly interested. “Retail’s been big data for decades,” said Mike Webster, general manager of Oracle Retail.

Oracle sells heavy-duty software that allows companies to track purchasing, supply chain, shipping, inventory and sales in stores and on the Internet.

Now, companies want to wed that type of data with information on where customers are, what they want, what they’re saying on their social network, and how and when to ship products to them. As more retailers try to harness all that information, Oracle has been doing more business.

“It gets real complicated real fast,” Webster said. “Our approach is to try to simplify that as best we can.”

All of the top 20 retailers in the world use Oracle, which has a presence in Minneapolis because it acquired the local company Retek in 2005. The company’s client list includes Best Buy, Gander Mountain, Scheels and Von Maur.

Data without end

In the fourth quarter of 2012, Minnesota employers had about 1,450 openings for computer systems analysts, software architects, database administrators and related positions.

Many of those jobs will be for people who can corral and analyze data, and that doesn’t include some of the analytics jobs coming open at utilities, marketing firms and human resources consultancies.

Meanwhile, the world keeps churning out data. Worldwide mobile data usage — mostly smartphone traffic — grew 70 percent in 2012. It was 885 petabytes per month, or 12 times more than all of global Internet traffic in 2000, according to Cisco.

The Carlson School has been offering data science electives at the U for eight years, and now wants to start a master’s program in business analytics and data science. The proposal, OK’d by the Carlson faculty, awaits approval from the Board of Regents.

Bapna’s vision is broader than teaching a particular software program.

He thinks the data generated by social media is an unprecedented social research graph, “a global laboratory, where we can ask fundamental questions about human behavior.”

He wants to build a master’s program that sculpts people with the technical training and business savvy to ask the clever questions and write the clever computer code that yields profitable insight.

Universities like Stanford, MIT, California-Berkeley, Harvard and Carnegie Mellon already run programs that turn out data scientists, and several other universities are shifting resources toward such training. Minnesota’s master’s program could start as early as the fall of 2014.

New skills in demand

For now, people tend to fall into big data jobs accidentally, Cheifetz said. Someone gets a Ph.D. in math, and ends up working on algorithms for Wall Street. An above-average IT manager learns the new software and takes ownership.

But that will change. Companies will begin to seek out workers with a strong background in computer science and statistics, and experience running predictive models, Cheifetz said. Also important is the ability to translate the data into a clear narrative. “We’re almost talking about a computer science and statistics undergrad, with a minor in theater so they can talk to people,” Cheifetz said.

The independent study class at Metro State University was thrown together with the help of Logic Information Systems, a local firm that consults for companies that use Oracle Retail.

Advance IT and the state of Minnesota have pushed the Oracle Retail training since November, when executives from Logic, Best Buy, Gander Mountain, Scheels and Von Maur told Gov. Mark Dayton that about 150 jobs are available running the software, with wages easily at $80,000 a year.

Source: startribune.com



Platinum leading the charge in precious metals

Precious metals in New York were higher across the board, with platinum leading the charge…higher by another $19 an ounce in the April contract at 1,715 – right near contract highs – continuing its bullish momentum on possible supply disruptions this year. That is also pushing palladium prices to new yearly highs once again, up $14 at 772…continuing its bullish momentum as the automobile industry is doing extremely well, which is propping up demand for these metals.

As I’ve stated in many previous blogs, I do believe that platinum and palladium are headed sharply higher from these levels.

Gold futures were very quiet, basically unchanged for the trading session at 1,649 an ounce…and are still trading in a sideways to lower pattern with really no trend in sight. Silver prices were slightly higher today, up $.08 at 30.98 in a lackluster trade in New York. Silver is still stuck in a sideways channel at this point, and I’m still advising traders to be long this entire complex because I do believe silver will join the party in the next couple of weeks – especially if platinum and palladium continue to make new highs.

Copper futures for the March contract closed up about 200 points at 3.7450, still right at the upper end of the range on the fact that the housing market is improving. The S&P 500 also hit a new 5-year high again today, which is pushing up optimism that demand for copper will increase in the coming months.


There is a substantial risk of loss in futures, futures options and forex trading. Furthermore, Seery Futures is not responsible for the accuracy of the information contained on linked sites. Trading futures and options is not appropriate for every investor.

Minnesota drops out of St. Louis River mercury project

The state of Minnesota has abruptly pulled out of a four-year, $1 million research project to identify the sources of mercury pollution in the St. Louis River on the Iron Range, a decision that stunned the Fond du Lac Band of Chippewa and dismayed federal regulators.

The mercury contamination, which makes much of the river’s game fish inedible for children and young women, is particularly worrisome because 1 in 10 infants on the North Shore of Lake Superior have been found to have unsafe levels of mercury in their blood. About 1 in 100 have levels high enough to harm neurological function, according to state health officials. The river’s estuary is also a critical breeding ground for fish in western Lake Superior.

Officials from the Minnesota Pollution Control Agency (PCA) said they are committed to reducing mercury pollution in the river and in the 10 percent of the state’s waters that have unusually high levels. But, they said, the agency’s lead scientists believe the state first needs more research on how mercury behaves in nature and why mercury levels in fish from the St. Louis River are significantly higher than those elsewhere.

Others with a stake in cleaning up the river say that sources of mercury are well-known: a combination of air emissions from power and taconite plants, and sulfate pollution. They say the federally funded research project would have provided some badly needed answers.

“The St. Louis River is [our] single most important fishing source,” said Nancy Schuldt, water project coordinator for the Fond du Lac Band of Lake Superior Chippewa, whose members rely on the river as a source of food. “We simply can’t walk away from this.”

This week Water Legacy, an environmental advocacy group, asked the U.S. Environmental Protection Agency (EPA) to hold a public meeting to discuss the project and the state’s actions. The public and other interested groups deserve a chance to weigh in on the decision, said Paula Maccabee, Water Legacy’s attorney.

“Stepping away is irresponsible,” she said.

Largest estuary

The St. Louis River drains Minnesota’s Iron Range before spreading out into the nation’s largest freshwater estuary between Duluth and Superior, Wis. — the primary incubator for aquatic life in western Lake Superior. The estuary, contaminated by a century of industrial activity, is a primary focus of the EPA and has received millions of dollars for remediation through the federally funded Great Lakes Initiative.

The state first tackled the river’s mercury problem more than 10 years ago, when it launched an effort under the federal Clean Water Act to identify the sources of pollution and to develop a plan to fix it. But that was halted when the PCA opted to first develop a statewide plan to reduce mercury emissions, making Minnesota one of the first states to do so. Under that plan, which calls for gradually reducing Minnesota’s mercury air emissions by two-thirds, 90 percent of the state’s lakes and rivers would eventually see fish contamination levels fall to safe levels.

But that’s not true for about 10 percent of the state’s lakes and rivers. For reasons that are not well understood, they are mercury “hot spots” where levels are significantly higher, and levels in the St. Louis River and its estuary are among the highest of all.

Four years ago the EPA offered money from the Great Lakes Initiative to address problems in the St. Louis River, creating a partnership between the federal government, Minnesota, Wisconsin and the Fond du Lac band to manage it. The project, which has cost just under $1 million to date, would have compiled reams of environmental data held by the state and industry, plus new data collected from the river and its tributaries.

It also would have included new research on how mercury behaves in the environment from a major effort partially funded by the taconite industry that is now underway at the Minnesota Department of Natural Resources. The industry has a keen interest because the research is designed to address whether sulfate, a dissolved mineral that flows from water treatment plants and iron ore pits, plays a role in transforming mercury into a form that gets into the food chain and eventually accumulates in predator fish like northerns and walleye.


The problem, according to state documents and e-mails, was the computer modeling program that the St. Louis River project would have used to crunch data and identify the sources of mercury. A year ago, state scientists raised questions internally about using the computer model for the mercury project, called a TMDL (total maximum daily load).

Then in February, when it came time for all the partners to sign off on the mercury research plan, Minnesota officials suddenly informed the EPA and others that the state would no longer participate because they feared that the study would produce unreliable results.

“There is something different about these waters,” Shannon Lotthammer, director of PCA’s environmental analysis division, said in an interview this week. She said the agency’s top scientists believe the computer analysis uses assumptions that might be inaccurate, and that could lead to decisions “that won’t solve the problem.” Lotthammer added, however, that the state is willing to work with the EPA and the other partners in addressing other pollutants in the St. Louis River.

But Alie Muneer, the Chicago EPA official leading the project, said Minnesota officials had never before expressed their concerns and that she was surprised and puzzled by the suddenness and “the magnitude” of the decision.

“With any TMDL there is always a level of uncertainty involved,” Muneer said.

She also pointed out that mercury assessments have been successfully completed on rivers in other states — some with less information than would be used in the St. Louis River assessment.

As designed, it “would have produced a scientifically defensible TMDL,” she said.

Schuldt said she knew PCA scientists had concerns about the computer model, one of several that would have been used. But she did not know that “it was a deal breaker,” and state officials never suggested an alternative, she said. “We all felt a little stunned.”

In a March letter to the EPA, Commissioner John Linc Stine of the PCA said that Minnesota is committed to resolving the mercury problem, that it is putting together a research proposal building on the DNR’s sulfate research to better understand the chemistry, and that it will seek funding for it as it goes along.

Muneer said she hopes differences can still be resolved at a May meeting about the project and the computer model.

But Schuldt said that if the project stalls, the EPA is not likely to devote more resources to finding ways to reduce mercury in the river and that she finds the prospect disheartening.

Source: startribune.com