A new hotel chain offers "guests a chance to re-live the college experience," reports Craig Karmin in The Wall Street Journal (8/27/14). Graduate Hotels "is targeting college towns across the US." They "won’t resemble beer-soaked fraternity houses or impersonal dormitories" but they hope to "appeal to folks coming back to college to watch sporting events, attend reunions or show the campus to their children." The chain is believed to be the first "to target exclusively college areas."
Each hotel "will have a bar and restaurant, locally inspired art collections and 100 to 150 rooms with handcrafted items and rates slightly above the area’s limited-service hotels," which typically include Hilton and Marriott. "These towns are seeing a renaissance," says Christian Strobel of Graduate Hotels. "They are often state capitals or cultural hubs for a state, and they attract entrepreneurial companies by offering an alternative to big cities." Graduate also hopes to cater to "people doing business with the universities or with other firms in town."
In Athens, Georgia, the Graduate Hotel "will include vintage ceramic lamps in the shape of the University of Georgia’s bulldog mascot, while album covers from REM and the B-52s, bands that got their start in the southern town, will adorn the walls. So will photos of Italian fashion designer Emilio Pucci, who studied agriculture at the school." The Tempe, Arizona, hotel "will feature an ant farm behind the front desk, a nod to Arizona State’s popular social insects department." Plans are "to open 20 of these hotels over the next five years."
The Industrial Revolution was a key factor in the advent of college football, reports Steve Almond in a Wall Street Journal review of The Opening Kickoff by Dave Revsine (8/27/14). In part this was because people had more leisure time, but it was also because men no longer had "the satisfyingly manly behaviors performed by their forebears. The resulting yearning for virility found its expression in movements such as Muscular Christianity and on college playing fields."
Schools, meanwhile, especially those "sprouting up across the Midwest, quickly seized on football as an ideal vehicle for promotion." The University of Chicago, intent on "founding a school to rival the Ivy Leagues … hired a former football star named Amos Alonzo Stagg as coach and gave him an express directive (in addition to a starting salary twice that of the average professor)" to make this happen. "Stagg was hard at work before the buildings on campus even had doorknobs."
In the early days the game was so brutal it was banned for a time by Harvard and others. Heavyweight boxing champ John L. Sullivan said, "There’s murder in that game." After "more than a dozen players died" in the 1905 season, President Teddy Roosevelt "convened a summit at the White House" to address "the game’s excesses." A committee was formed that ultimately "legalized the forward pass, an innovation that shifted the emphasis of the game away from mass collisions and thus curtailed much of its violence."
In his latest book, Greil Marcus distills the history of rock ‘n’ roll into just 10 songs, reports Wesley Stace in The Wall Street Journal (8/23/14). In part, Greil picks his list based on certain themes, like "escape," that are particular to rock ‘n’ roll. But the overarching idea "is how different artists cover, and re-discover, one another’s work." For example, In the Still of the Night, by the Five Satins, "has hit the Billboard charts in four different decades." Phil Spector’s To Know Him Is to Love Him was written for the Teddy Bears and covered by Amy Winehouse, among others.
In addition to those two, the songs on Greil’s list are: Shake Some Action; Transmission; All I Could Do Was Cry; Crying, Waiting, Hoping; Money (That’s What I Want); Money Changes Everything; This Magic Moment; and Guitar Drag. Chances are you don’t know at least some of these works, and note that none was written by Bob Dylan or The Beatles. However, Greil circles "back to the big guns," noting, for example, that the Beatles don’t cover Buddy Holly so much as they conduct "a kind of séance with him."
Greil also connects Ben E. King’s rendition of This Magic Moment to "Rabbit Brown, serenading sweethearts on Lake Pontchartrain in 1927" and back again to Lou Reed’s performance of the song in David Lynch’s Lost Highway. The connection between cinema and rock ‘n’ roll is also explored, suggesting that "the screen, silver or small, is where rock really happens." And he quotes Bob Dylan’s observation about Like A Rolling Stone: "a ghost is writing a song like that, it gives you the song and it goes away."
Maya Beiser channels Janis Joplin, Kurt Cobain, Jimi Hendrix and others through her cello, reports Corinne Ramey in The Wall Street Journal (8/26/14). Her idea is to play their “voice.” With Kurt Cobain, for instance, Maya produces a “grimy, fuzzy howl.” “His voice is raspy and out of tune and not clean,” says Maya. Getting at that “is about … diving into that world.” Similarly, Jimi Hendrix’s solo on Little Wing is “so out of tune … You have to learn to bend and sort of be around the note.”
“All of this insane perfectionism that we’re taught, none of that exists in rock,” says Maya, who traces that realization to hearing Janis Joplin sing as a teenager. “It was this revelation, that someone can be so raw and give it all,” she says. Her new record, Uncovered, layers “as many as 20 cello tracks on top of one another,” turning “one acoustic cello into a rock band.” Maya also detunes the cello and uses “certain kinds of plucking to create a rock-band palette of sounds.”
Arranged in collaboration with composer and professor Evan Ziporyn of MIT, the songs are reduced “down to their essence, to expose this really beautiful thing,” says Evan. “It’s too perfect if you just imitate,” he says. Maya says the result is not ‘pop,’ which she defines as “formulaic” and “makes lots and lots of money.” “Every artist on this album is anything but formulaic,” she says. “I think this music is just as valuable and important as Bach and Schubert.”
Zippo is doing better than ever even though the number of US smokers is half what it was in the 1950s, reports Abram Brown in Forbes (9/8/14). Last year’s sales topped $200 million, a record. Zippo’s claim to fame is, of course, its innovative and iconic cigarette lighter, developed in 1932 "with a windproof chimney and a distinctive hinged lid." "After soldiers received the lighters in WWII, Zippo successfully marketed itself with a utilitarian, made-in-America image for the following half century."
Each lighter came with a lifetime guarantee, "meaning Zippo would continue to fix the lighter as long as its owner sent it to the factory." This apparently worked for Frank Sinatra, who "was buried with his trusty Zippo in 1998." It hasn’t worked so well for younger consumers "who were children when Sinatra died." The key to Zippo’s renewed success is largely its positioning "as a maker of talismans, lucky charms — or something akin to customized belt buckles" — and the "30,880 unique designs" it produced last year.
That’s "up from 8.900 a decade ago … partly owing to a new Zippo.com feature where you can design your own lighter from scratch." Zippo has also expanded into China, opening 14 retail stores there, "riding the idea of Zippo as an all-American lifestyle brand. The stores carry a Zippo-designed clothing line." Zippo has two stores in Las Vegas, as well, and has further line-extended into camping gear. "This is just a metal box," says George Duke, Zippo’s third-generation owner, adding: "There’s a lot you can do with a metal box."
D’Addario succeeds "by experimenting with a commodity good and refining it through small, but significant, innovations," reports Karsten Strauss in Forbes (9/8/14). The commodity is musical instrument strings, which it turns out at a rate of "some 700,000 per day." This "netted an estimated $12 million on $169 million" in revenue, including other accessories, with growth averaging "6.2% a year during the past decade." It is a long way from the company’s roots in "17th-century" Italy, and its US entry in 1905.
As recently as the 1950s, Charles D’Addario worked out of his basement in Queens, New York, "where sour-smelling animal intestines stretched on racks were twisted into strings bound for violins, cellos and harps." He’d then sell his wares "out of his car to luthiers and players from Boston to Washington DC … It was Charles’ son, John … who recognized the benefits of synthetic materials, like DuPont’s new creation, nylon, invented in 1935." After Elvis happened, John split off to manufacture steel strings, for the electric guitar.
John’s chief innovation was to make the "string’s steel cores hexagon-shaped instead of cylindrical, which gave the wires wrapped around them something to hold onto, creating a stable string that rang true." John’s son Jim, the current CEO, says automated equipment is critical: "Whenever a major innovation was developed, we would retrofit the entire fleet of machinery," he says. The inherently disposable nature of the strings is another key, because "they wear out and need to be replaced frequently."
Amid the rise of Big Data is a decline in reliable research, reports David Leonhardt in The New York Times (8/26/14). "The declining response rate to surveys of almost all kinds is among the biggest problems in the social sciences," David writes. "It’s complicating our ability to understand how people live and what they believe." This is evident in the Labor Department’s jobs report, which some believe has "become less accurate over the last two decades, in part because of this rise in non-response."
The response rate of the jobs report (about 89%) is actually relatively high compared to, say, political surveys (9%). Both are down, however — in the 1980s the jobs report response rate was 96%, and in 1997 "the response rate to a typical telephone poll was a healthy 36 percent, according to Pew." Another issue is "rotation-group bias," which occurs when the "government surveys people for four consecutive months, gives them eight months off and then surveys them for four more months."
The problem is that, over time, "the kinds of answers that people give — or the kinds of people who respond — change." The result is that "people who aren’t working are less likely to report being available to work and having looked for a job in the previous four weeks, which is the definition of unemployment." Other limiting factors are the rise of caller ID, the decline of landline telephones and falling trust in the government. In general, it appears that changes in American telephone habits will require new technologies and methods for collecting information.
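To see why falling response rates matter, here is a small illustrative sketch using the rates cited above (the 10,000-contact survey and the margin-of-error framing are my own illustration, not from the article). As the response rate falls, the completed sample shrinks and the best-case margin of error widens; and, as the article stresses, no amount of sampling math fixes the deeper problem of bias in who chooses to respond.

```python
import math

def margin_of_error(completed, z=1.96):
    # Worst-case 95% margin of error for an estimated proportion
    # (assumes p = 0.5, a simple random sample, and no nonresponse
    # bias -- the bias is the part arithmetic cannot repair).
    return z * math.sqrt(0.25 / completed)

contacted = 10_000  # hypothetical number of people contacted
for rate in (0.96, 0.89, 0.36, 0.09):
    completed = int(contacted * rate)
    print(f"response rate {rate:.0%}: n={completed}, "
          f"margin of error ±{margin_of_error(completed):.2%}")
```

Even at a 9% response rate the margin of error stays modest; the real damage, per the article, is that the 9% who answer stop resembling the 91% who don’t.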
Researchers are grappling with the ethical dilemma of deceiving their subjects "in the name of science," reports Shirley S. Wang in The Wall Street Journal (8/26/14). The argument goes that there are some circumstances under which failing to tell subjects they are being studied is the only way to get accurate information. Geoff Pearson of the University of Liverpool applied this premise in a ten-year study of "the behavior of rowdy soccer fans in the UK" after deciding "that just talking to them wasn’t good enough."
So, instead "he joined the fans at football matches to watch how the crowd behavior went from calm to rowdy … He wrote down his observations while huddled in the restroom, talked into a recorder by pretending it was a cellphone and jotted down copious notes after matches. Most important, he didn’t tell fans he was studying them." This flies in the face of "one of the main tenets of ethical research … that participants should be informed that they’re being studied and for what purpose."
Kypros Kypri of the University of Newcastle, meanwhile, reports that "just asking heavy drinkers about their alcohol use sometimes changes their behavior." In this case, the research doubled as a kind of intervention, but without subjects knowing it. Kypros argues that the health benefits outweighed the ethical issues. In Geoff Pearson’s case, the undercover research helped improve policing at soccer matches. A key insight was that disorderly conduct was more likely when police "dressed in riot gear." Things went more smoothly "when police engaged in friendly conversation with the crowd."
Michael Harris thinks we need to limit our intake of technology just as we do fat and sugar (The Economist 8/16/14). Michael is author of The End of Absence, in which he argues that putting ourselves on a high-tech diet may be the last, best hope for those who still remember life before the web to pass along healthier habits to the next generation. His premise is that when "our insatiable appetites — for information, stimulation, validation" are instantly met, "the knowledge of what it is to be left unfulfilled may not."
In other words: "A culture of abundance devalues consumption. It fosters a vague feeling of dissatisfaction. Even the basic act of contemplation may suffer if idleness — when waiting for a bus, for example — is replaced with the easy entertainment offered by mobile phones. As with great music, silences are as much a part of the human experience as soaring crescendos. There is no inspiration without reflection." "Every technology will alienate you from some part of your life," Michael writes. "Your job is to notice."
Michael provides ample basis for his concern. As detailed in a Wall Street Journal review of his book (8/7/14): "Global Internet usage has expanded more than 500% in the last decade. YouTube users uploaded 100 hours of video for every minute of real time in 2013 … Google processes over 3.5 billion queries a day while each American owns, on average, four digital devices. A 2013 report found that Americans aged 18-64 spend an average of 3.2 hours a day on social networking sites."
"The world is coming back in the direction of paywalls, and of print," says John R. MacArthur, publisher of Harper’s Magazine, in a New York Times article by Ravi Somaiya (8/11/14). It’s a world that John never left, and that rests "on three pillars" — that the web is bad for writers, publishers and readers. It’s bad for writers because they "are too exhausted by the pace of an endless news cycle to write poised, reflective stories and are paid peanuts if they do."
It’s bad for publishers "who have lost advertising revenue to Google and Facebook and will never make enough from a free model to sustain great writing." It’s bad for readers "who cannot absorb information well on devices that buzz, flash and generally distract." Harper’s, which was founded in 1850 and is a non-profit funded by a foundation, "has been available online for a decade. But to read anything, you must subscribe to the physical magazine, too." John also discourages his writers from "publishing about their work elsewhere on the Internet," and eschews line extensions into conferences or videos.
"A magazine should be a magazine," he says. Clara Jeffery, a former Harper’s editor disagrees. "He does truly believe that technology is in opposition to good writing, financially, stylistically and journalistically," she says. "I don’t understand. Nothing is all good or all bad." John thinks it’s simple: "If you deliver stuff that nobody else is doing, in a world of increasing mediocrity or lack of standards, you’re providing something that’s very well edited, very enjoyable, very informative, very provocative, people will continue to pay for it."
So called "digital natives" aren’t as "inherently skilled with computers" in at least one important way, reports Christopher Chabris a The Wall Street Journal review of The Organized Mind by Daniel J. Levitin (8/16/14). Younger generations "have trouble distinguishing media outlets and websites that at least try to report news and facts objectively from those that are deliberately partisan or ideological. Even medical students aren’t good at telling high quality journals … from low quality ones."
Their challenge — like that of all human beings — is that "our minds mostly evolved long before the invention of reading and writing, let alone mass media," so "it stands to reason that a fine eye for evaluating the quality of sources must be learned, and even taught, rather than assumed to be part of our standard equipment." In general, our brain capacities "grew out of solutions to the problems that our ancestor species confronted when living in the natural world," which is "utterly unlike the world of information overload we now face."
The Organized Mind offers advice on how to cope, largely premised on offloading "personal management" tasks to increase focus on more important or creative endeavors. Perhaps ironically, technology can help, for example the "calendar app that buzzes quietly 15 minutes before each appointment." Daniel offers other advice, such as a "junk drawer" to store one-of-a-kind items, or simply keeping important things in "plain sight" at all times. The premise is that an organized brain has more freedom to come up with better ideas.
Yawning doesn’t necessarily mean you are tired or bored. It could mean your brain is on fire, reports Jonathan Rockoff in The Wall Street Journal (8/19/14). First, forget the old idea that yawning was "the body’s way of correcting for a dearth of oxygen." It’s also a myth that our "tendency to yawn when other people yawn" is "an expression of empathy." The latest thinking is that it’s changes in brain chemistry that make us yawn, and that yawns "appear to have many different causes and to serve a variety of functions."
Yawning, it is now believed, is "a means to keep our brains alert in times of stress. Contagious yawning appears to have evolved in many animal species as a way to protect family and friends, by keeping everyone in the group vigilant … A leading hypothesis is that yawning plays an important role in keeping the brain at its cool, optimal working temperature. The brain is particularly sensitive to overheating, according to Andrew Gallup, an assistant professor of psychology at SUNY Oneonta."
This may explain why yawning sometimes overcomes public speakers and athletes before they perform, and why yawning is more common in summer months. Dr. Gallup bases his theory on experiments with rats, in which a subtle rise in brain temperature triggered a yawn, with the temperature falling again once the yawning finished. So, yawning may be kind of like air conditioning for the brain. Yawning after waking up may be tied to the higher dopamine levels that come with rising — dopamine is linked to the "brain receptors that turn yawning on and off." Yawning when tired may be because brain temperature "is highest at night."
To get a sense of Unilever’s ethos, its CEO slept al fresco in the founder’s rooftop bed, reports The Economist (8/9/14). "William Lever, founder of what is now Unilever," slept nightly in the open, atop his mansion. Paul Polman, the company’s current CEO, only spent one night there, but the experience "helped persuade him, a year later, to launch the Sustainable Living Plan … his attempt to make Unilever the pre-eminent example of how to do capitalism responsibly, just as it had once been under Lever."
Mr. Lever "had pioneered the Victorian model of paternalistic business. At a time when disease and malnutrition were widespread in Britain, his products were marketed for their health benefits. His employees were decently housed in a purpose-built company town. Lever campaigned for state pensions for the elderly and even provided schooling, health care and good wages at palm-oil plantations in the Congo." The 21st century version of this vision is different, however, in that it focuses on changing consumer behavior.
After measuring "the carbon footprint of some 2,000 products," Unilever "found that on average 68% of greenhouse gas emissions … occurred only after they got into the hands of consumers." So, its goals include "getting 200m consumers by 2015 to take shorter showers." It is also reprising the launch of Lifebuoy soap in 19th-century America with a "handwashing campaign" to reduce illnesses in modern-day India. "The challenge now is to do the same with brands that do not have such obvious benefits as Lifebuoy."
Procter & Gamble hopes consumers will find a place in their closets for a new kind of laundry machine, reports Elizabeth Holmes in The Wall Street Journal (8/13/14). Developed in collaboration with Whirlpool, the machine is called Swash. It stands "more than four feet tall" and "uses gel-filled pods to help neutralize odors, remove wrinkles and restore a garment’s fit." Swash is not intended "to replace laundering or dry cleaning … just delay them." It is aimed at "a new laundry consumer: the re-wearer."
"Today, it’s smart," says Mike Grieff, P&G’s research and development director for new business creation and innovation. "Why would I wash something and go through the process if it’s really, really not that dirty?" Procter & Gamble has been developing against this insight for several years now, initially creating "a line of consumable products, including odor- and wrinkle-removing sprays." These were meant for "college students who didn’t want to do laundry."
The target now is "a higher-spending group of fashion-conscious people" — both men and women. Swash does not come cheap, retailing at $499, plus another $6.99 for the gel pods, each good for one use. Basically, the user hangs a garment inside, which is then sprayed "with a gel-like solution, hydrating the fibers to remove wrinkles and restore fit. Thermal heating technology dries the garment in 10 minutes," which consumers said was about how long it takes to shower. "It’s like a microwave for your clothes," says Mike.
Foxes are "more wily and flexible learners" than hedgehogs because of their childhoods, reports Alison Gopnik in The Wall Street Journal (8/20/14). The difference between the two animals — whose traits are often ascribed to people — was defined by the Greek poet Archilochus, who said: "The fox knows many things, but the hedgehog knows one big thing." (Archilochus apparently knew one thing but didn’t know how to make it rhyme.) The concept was later popularized by Isaiah Berlin, an Oxford philosopher.
Berlin ultimately decided his observation was oversimplified, but recently, psychologist Philip Tetlock "studied expert political predictions and found that foxy, flexible, pluralistic experts were much more accurate than experts with one big hedgehog idea … In tribute to this finding, the statistics whiz Nate Silver chose a fox as the logo for his website." Biologist David MacDonald, meanwhile, suggests the fox-hedgehog difference is, well, biological: "Hedgehogs develop their spines — that one big thing — almost as soon as they are born."
This makes them independent within six weeks, compared to fox cubs, who "are dependent for six months." Where hedgehog dads take off after mating, fox dads "help bring food to the babies." Not only that, they bring still-alive prey "and the babies play at hunting them." They "practice and develop the flexible hunting skills and wily intelligence that serve them so well later on." So, where hedgehogs quickly adapt to one environment, a combination of parental protection and play teaches foxes to "cope with a changing world."
Contrary to popular belief, the Oakland A’s success is not because of homegrown players, reports Jared Diamond in The Wall Street Journal (8/20/14). In fact, the "A’s have used just four homegrown players in 2014, the fewest in baseball by a wide margin." By comparison, the languishing New York Mets "have used 21, the third-highest total." Interestingly, both teams have "similar payrolls" — the Mets actually spend a bit more ($85 million) than the A’s ($82.3 million). The difference may be explained by "two central tenets."
The first tenet is general manager Billy Beane’s legendary "ability to identify (or luck into) cheap, productive players whom his competitors don’t want … No team does a better job of rummaging through everybody else’s attic and discovering gold." The other is Billy’s "commitment to financial flexibility," which enables him to "constantly tinker with his assets." Unlike the Mets, he doesn’t allocate 23.5% of his budget to a single player (David Wright), much less 49% to three players (David Wright, Curtis Granderson and Bartolo Colon).
The A’s highest-paid player at season’s start, Yoenis Cespedes, accounted for just 12.8% of the team’s budget, and its three highest-paid (Yoenis Cespedes, Jim Johnson and Scott Kazmir), "made up 35.8% of the payroll." Yoenis has since been traded. Granted, it may be easier for a smaller-market team like the A’s to avoid employing big stars, but the bottom line is "the A’s are within striking distance of their third consecutive American League West crown," and 8th since 2000, while the Mets haven’t made the playoffs in eight years.
The newest Westside Market is opening on New York’s East Side, changing everything but its name, reports Kia Gregory in The New York Times (8/19/14). The store is Ioannis and Maria Zoitas’ sixth — four are in Manhattan and one — called Maywood’s Marketplace — is in Maywood, New Jersey. The new East Side store will keep the Westside name, but its product offerings will match neighborhood tastes. "One block makes a difference in this city," says Jimmy Beleses, who will run the store with his brother-in-law, George Zoitas.
"Every neighborhood has its food culture, particularly as food has evolved from necessity to a form of identity. What sells uptown, like rice cakes, barely budges downtown. For truffles, it is vice versa … A 64-ounce-size laundry detergent does not sell to downtowners; they want 32 ounces … Uptown will buy fried chicken. Downtown, it is chicken baked with bread crumbs. Uptown, Amy’s canned soups are popular. Downtown, homemade kale chips sell better. Uptown, Kobe brisket. Downtown, seltzer barely sells."
"Our dairy buying, our produce buying, our grocery buying, our fish buying is all store level," says George. Ioannis — known to friends as ‘Big John’ — designs the store interiors "with pencils, paper, a ruler and a few erasers." He started out as a stock boy at Westside’s first location at Broadway and 110th, buying and re-naming it 37 years ago. Back then the store carried mostly just the basics, but today, says Big John, "People … eat more quality and they will spend extra for it."
A decade’s worth of search data reveals America’s cultural and consumer divide, reports David Leonhardt in The New York Times (8/19/14). The search data — provided by Google — was cross-referenced with a Times "analysis of every county in the country to determine which were the toughest places to live." That analysis was "based on an index of six factors including income, education and life expectancy," and found that those living in "large areas of Kentucky, Arkansas, Maine, New Mexico" tend to lead the toughest lives.
The Google analysis meanwhile found that in those regions, the most common search topics included "weight-loss diets, guns, video games and religion. The dark side of religion is of special interest: Antichrist has the second-highest correlation with the hardest places, and searches containing ‘hell’ and ‘rapture’ also make the top 10 … In the easiest places to live, the Canon Elph and other digital cameras dominate the top of the correlation list." Apparently, "where life seems good," people "want to record their lives in images."
These areas include "Nebraska, Iowa, Wyoming, and much of the large metropolitan areas of the Northeast and West Coast." "Holiday greetings" is a popular term in the easier places, perhaps because they are populated by people who have moved away from relatives or childhood friends, while "Merry Christmas" is a more popular wish in harder places. "The phrase ‘pull-out’ is also relatively popular in the easiest places. It presumably refers to either a kind of sofa or a kind of birth control."
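At its core, the Times/Google exercise computes, for each search term, the correlation between the term’s per-county frequency and the county hardship index. A minimal Python sketch of that calculation, with invented numbers standing in for the real county data:

```python
# Hypothetical miniature of the Times/Google analysis: correlate a
# county-level hardship index with the relative frequency of one
# search term. All figures below are invented for illustration.

def pearson(xs, ys):
    # Pearson correlation coefficient: covariance of the two series
    # divided by the product of their standard deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hardship = [8.2, 7.5, 6.9, 4.1, 3.0, 1.8]        # higher = tougher county
searches = [0.91, 0.84, 0.80, 0.42, 0.30, 0.15]  # relative term frequency

print(f"correlation: {pearson(hardship, searches):+.2f}")
```

A coefficient near +1 (as with "antichrist" in the hardest places) means the term is searched more where life is tougher; near -1 (digital cameras) means the opposite.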
"You can’t have world-changing discoveries without allowing apparently pointless research," writes Steven Poole in a Wall Street Journal review of This Is Improbable Too, by Marc Abrahams (8/9/14). This is "not only because the latter sometimes turns into (or at least inspires) the former, but because there’s no way to tell what will be important before the results are in." You may recognize the book’s author as the founder of the Ig Nobel prizes for "improbable research."
The Ig Nobel prize is best known for silliness, such as the finding "that people would be able to run across the surface of a pond but only on the moon," or research that "confirmed scientifically that the more alcohol you consume, the more attractive you are." The premise is that such research is "not entirely stupid and humiliating" but rather "first makes you laugh and then makes you think." At a minimum, it is "a celebration of the small truth or pleasing resolution of the everyday conundrum."
For example, there’s been research to explain why "sawing horizontally with a knife" works better than "pressing straight down with the blade." The book also offers an "amusing counterweight to the cultural picture of scientists as heroic figures" and takes aim at the "mania for measurement," such as one scientist’s 35-year study "of how fast his fingernails grew." And then there is the quirky but useful, such as a study of the "size of the CEO’s signature on SEC filings"; the bigger the signature, the worse the company’s performance.
"It’s an absolute myth that you can send an algorithm over raw data and have insights pop up," says Jeff Heer of the University of Washington in a New York Times article by Steve Lohr (8/18/14). This minor detail is often lost in the excitement over Big Data’s potential. The problem is that "data scientists … spend from 50 percent to 80 percent of their time mired" in the mundane task "of collecting and preparing unruly data, before it can be explored for useful nuggets."
This is because Big Data, by definition, is drawn from multiple sources — each of which may arrive in its own special format. Compounding the challenge "is the ambiguity of human language," meaning that different sources may describe the same thing using different words. A number of start-ups, such as ClearStory, are trying to solve such problems by developing software "that recognizes many data sources, pulls them together and presents the results visually as charts, graphics, or data-filled maps."
Other start-ups on a similar path include Trifacta, which uses "machine-learning technology to find, present and suggest types of data that might be useful for a data scientist" and Paxata, whose focus is on "finding, cleaning and blending data so that it is ready to be analyzed." The vision is that such tools will make data analysis accessible to "a wider market of business users beyond data masters," not unlike the way "spreadsheets made financial math and simple modeling accessible to millions of non-experts."
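What "finding, cleaning and blending" looks like in miniature: the sketch below (the feeds, field names and figures are all invented for illustration) shows two sources describing the same customers under different labels and formats, which must be normalized to a common key before any analysis can run. This is the unglamorous 50-to-80-percent of the work the article describes.

```python
# Two hypothetical feeds describing the same customers, each with its
# own field names, capitalization, whitespace, and units.
crm_feed = [
    {"Customer Name": "Acme Corp ", "Revenue ($K)": "1,250"},
    {"Customer Name": "Globex",     "Revenue ($K)": "980"},
]
billing_feed = [
    {"client": "acme corp", "invoiced_usd": 1_250_000},
    {"client": "globex",    "invoiced_usd": 975_000},
]

def normalize_name(raw):
    # Collapse whitespace and lowercase, so "Acme Corp " and
    # "acme corp" resolve to the same join key.
    return " ".join(raw.split()).lower()

def normalize_crm(row):
    # Strip thousands separators and convert $K to dollars.
    dollars = int(row["Revenue ($K)"].replace(",", "")) * 1000
    return normalize_name(row["Customer Name"]), dollars

# Blend: merge both feeds into one record per normalized customer key.
merged = {}
for row in crm_feed:
    name, revenue = normalize_crm(row)
    merged[name] = {"crm_revenue": revenue}
for row in billing_feed:
    name = normalize_name(row["client"])
    merged.setdefault(name, {})["invoiced"] = row["invoiced_usd"]

for name, fields in merged.items():
    print(name, fields)
```

Only after this step can the "useful nuggets" be hunted, such as noticing that the two feeds disagree about Globex by $5,000. Tools like Trifacta and Paxata aim to automate exactly this kind of reconciliation.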