New study shows that Thompson sampling can be naturally combined with a classical linear program formulation to include inventory constraints.
In 1933, William R. Thompson published an article on a Bayesian model-based algorithm that would ultimately become known as Thompson sampling. This heuristic was largely ignored by the academic community until recently, when it became the subject of intense study, thanks in part to internet companies that successfully implemented it for online ad display.

Thompson sampling addresses the exploration-exploitation tradeoff in the multiarmed bandit problem: it chooses actions that maximize immediate performance while continually acquiring new information to improve future performance.

In a new study, “Online Network Revenue Management Using Thompson Sampling,” MIT Professor David Simchi-Levi and his team have now demonstrated that Thompson sampling can be used for a revenue management problem where the demand function is unknown.
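The study couples Thompson sampling with a linear program to handle inventory constraints; the LP component is beyond a short sketch, but the sampling idea itself can be illustrated on a classic Bernoulli bandit. The following is a minimal, illustrative sketch (all names are ours, not from the paper): each arm keeps a Beta posterior over its unknown reward probability, and each round the algorithm samples from every posterior and pulls the arm whose sample is highest.

```python
import random

def thompson_sampling(true_probs, n_rounds=10000, seed=0):
    """Bernoulli multi-armed bandit solved with Thompson sampling.

    Each arm keeps a Beta(successes + 1, failures + 1) posterior over
    its unknown reward probability. Every round, we draw one sample
    from each posterior and pull the arm with the highest draw, so
    uncertain arms still get explored while good arms get exploited.
    """
    rng = random.Random(seed)
    n_arms = len(true_probs)
    successes = [0] * n_arms
    failures = [0] * n_arms
    total_reward = 0
    for _ in range(n_rounds):
        # Sample a plausible reward probability for each arm.
        samples = [rng.betavariate(successes[a] + 1, failures[a] + 1)
                   for a in range(n_arms)]
        arm = max(range(n_arms), key=lambda a: samples[a])
        # Observe a Bernoulli reward and update that arm's posterior.
        reward = 1 if rng.random() < true_probs[arm] else 0
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return total_reward, successes, failures
```

Over time the posteriors concentrate, and nearly all pulls go to the best arm; the revenue-management setting in the paper layers inventory constraints on top of this basic loop via the linear program.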
New chip reduces neural networks’ power consumption by up to 95 percent, making them practical for battery-powered devices.
Most recent advances in artificial-intelligence systems such as speech- or face-recognition programs have come courtesy of neural networks, densely interconnected meshes of simple information processors that learn to perform tasks by analyzing huge sets of training data.

But neural nets are large, and their computations are energy intensive, so they’re not very practical for handheld devices. Most smartphone apps that rely on neural nets simply upload data to internet servers, which process it and send the results back to the phone.
Now, MIT researchers have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times over its predecessors, while reducing power consumption 94 to 95 percent. That could make it practical to run neural networks locally on smartphones or even to embed them in household appliances.
“The general processor model is that there is a memory in some part of the chip, and there is a processor in another part of the chip, and you move the data back and forth between them when you do these computations,” says Avishek Biswas, an MIT graduate student in electrical engineering and computer science, who led the new chip’s development. “Since these machine-learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption.
But the computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don’t need to transfer this data back and forth?”

Biswas and his thesis advisor, Anantha Chandrakasan, dean of MIT’s School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, describe the new chip in a paper that Biswas is presenting this week at the International Solid-State Circuits Conference.
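To see why the dot product dominates, note that each node in a neural-network layer multiplies its inputs by learned weights and sums the results — exactly a dot product, repeated once per node. A plain-Python sketch of that arithmetic (illustrative only; the chip computes the same operation in analog circuitry inside the memory itself):

```python
def dot(w, x):
    """Dot product: multiply elementwise, then sum.

    This is the single operation the chip implements in memory.
    """
    return sum(wi * xi for wi, xi in zip(w, x))

def dense_layer(weights, x):
    """A fully connected layer is one dot product per output node,
    so accelerating dot products accelerates the whole network."""
    return [dot(row, x) for row in weights]
```

In a conventional processor, every weight and input in these loops would be shuttled between memory and the processor; performing the multiply-and-sum where the data already sits is what eliminates most of that traffic.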
An illustration of real-world behavioral commonalities in raw data of transactions. Platform analyzes big data to answer plain-language business queries in minutes instead of months.
Companies often employ number-crunching data scientists to gather insights such as which customers want certain services or where to open new stores and stock products. Analyzing the data to answer one or two of those queries, however, can take weeks or even months.
Now MIT spinout Endor has developed a predictive-analytics platform that lets anyone, tech-savvy or not, upload raw data and input any business question into an interface — similar to using an online search engine — and receive accurate answers in just 15 minutes.
The platform is based on the science of “social physics,” co-developed at the MIT Media Lab by Endor co-founders Alex “Sandy” Pentland, the Toshiba Professor of Media Arts and Sciences, and Yaniv Altshuler, a former MIT postdoc. Social physics uses mathematical models and machine learning to understand and predict crowd behaviors.
Users of the new platform upload data about customers or other individuals, such as records of mobile phone calls, credit card purchases, or web activity. They use Endor’s “query-builder” wizard to ask questions, such as “Where should we open our next store?” or “Who is likely to try product X?” Using the questions, the platform identifies patterns of previous behavior among the data and uses social physics models to predict future behavior. The platform can also analyze fully encrypted data-streams, allowing customers such as banks or credit card operators to maintain data privacy.
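Endor’s social-physics models are proprietary and not detailed here. Purely to make a query like “Who is likely to try product X?” concrete, the following is a hypothetical nearest-neighbor sketch of the general idea — letting the past behavior of behaviorally similar users vote on a target user’s future behavior. All data and function names are invented for illustration:

```python
def jaccard(a, b):
    """Similarity between two behavior histories as set overlap."""
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def likely_to_try(target_history, users, product, k=3):
    """Score how likely the target is to try `product` by checking
    the k users whose purchase histories are most similar."""
    scored = sorted(users,
                    key=lambda u: jaccard(target_history, u["history"]),
                    reverse=True)[:k]
    return sum(product in u["history"] for u in scored) / k
```

A real system would of course use far richer behavioral models than set overlap; the point is only that the query reduces to finding behavioral commonalities in the raw data.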
“It’s just like Google. You don’t have to spend time thinking, ‘Am I going to spend time asking Google this question?’ You just Google it,” Altshuler says. “It’s as simple as that.” Financially backed by Innovation Endeavors, the private venture capital firm of Eric Schmidt, executive chairman of Google parent company Alphabet, Inc., the startup has found big-name customers, such as Coca-Cola, Mastercard, and Walmart, among other major retail and banking firms. Recently, Endor analyzed Twitter data for a defense agency to detect potential terrorists. Endor was given 15 million data points containing examples of 50 Twitter accounts of identified ISIS