Carnegie Mellon University’s “assistive technologies” project is what would happen if a more socially minded MacGyver got his hands on a 3D printer.
Created by a pioneering research team at CMU’s School of Computer Science, the project’s goal is to build life-changing prosthetics for people in need — and to do it more cheaply than would have been possible before.
Recently, the team used its considerable expertise to create a prosthesis allowing a would-be cello player with only one arm to play his instrument of choice. Thanks to the tool they built, the Pittsburgh-based budding musician was able to play at his grade school recital.
Read Article: https://www.digitaltrends.com/cool-tech/carnegie-mellon-3d-printed-prosthetics/#ixzz4gwcUDuZg
In order for AI and big data to be successful, companies must combine them with business expertise and insight – making it something the C-suite can’t ignore.
Read Article: http://www.information-age.com/whats-key-big-data-ai-successful-123464248/
In a forecasting exercise, Gordon Earle Moore, co-founder of Intel, plotted data on the number of components—transistors, resistors, and capacitors—in chips made from 1959 to 1965. He saw an approximate straight line on log paper (see Figure 1). Extrapolating the line, he speculated that the number of components would grow from 2^6 (64) in 1965 to 2^16 (65,536) in 1975, doubling every year. His 1965–1975 forecast came true. In 1975, with more data, he revised the estimate of the doubling period to two years. In those days, doubling components also doubled chip speed because the greater number of components could perform more powerful operations and smaller circuits allowed faster clock speeds. Later, Moore's Intel colleague David House claimed the doubling time for speed should be taken as 18 months because of increasing clock speed, whereas Moore maintained that the doubling time for components was 24 months. But clock speed stabilized around 2000 because faster speeds caused more heat dissipation than chips could withstand. Since then, faster speeds have been achieved with multi-core chips at the same clock frequency.
Read Article: cacm.acm.org/magazines/2017/1/211094-exponential-laws-of-computing-growth/fulltext
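Moore's extrapolation above is simple compounding: starting from 2^6 components and doubling once per period for ten periods yields 2^16. A minimal sketch of that arithmetic (the function name and parameters are illustrative, not from the article):

```python
def components(start_year, end_year, base=2**6, doubling_period=1):
    """Components on a chip after doubling every `doubling_period` years,
    starting from `base` components in `start_year`."""
    doublings = (end_year - start_year) // doubling_period
    return base * 2**doublings

# Moore's 1965 forecast: 64 components doubling yearly through 1975
print(components(1965, 1975))  # 65536, i.e. 2**16

# His 1975 revision: a two-year doubling period slows the curve
print(components(1975, 1985, base=2**16, doubling_period=2))
```

Changing `doubling_period` from 1 to 2 is exactly Moore's 1975 revision: the same exponential law, compounding half as often.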
A wiggly, ravenous caterpillar — one that doesn’t limit its diet to naturally grown objects — can biodegrade plastic bags, a material infamous for the amount of time it takes to decompose, a new study finds.
Read Article: https://www.livescience.com/58810-caterpillar-biodegrades-plastic-bags.html
What is the cloud? Where is the cloud? Are we in the cloud now? These are all questions you’ve probably heard or even asked yourself. The term “cloud computing” is everywhere.
In the simplest terms, cloud computing means storing and accessing data and programs over the Internet instead of your computer’s hard drive. The cloud is just a metaphor for the Internet. It goes back to the days of flowcharts and presentations that would represent the gigantic server-farm infrastructure of the Internet as nothing but a puffy, white cumulus cloud, accepting connections and doling out information as it floats.
Read Article: https://www.pcmag.com/article2/0,2817,2372163,00.asp
by Bradley Mitchell
Updated March 19, 2017
The terms “information technology” and “IT” are widely used in business and the field of computing. People use the terms generically when referring to various kinds of computer-related work, which sometimes confuses their meaning.
What Is Information Technology?
A 1958 article in Harvard Business Review referred to information technology as consisting of three basic parts: computational data processing, decision support, and business software.
This time period marked the beginning of IT as an officially defined area of business; in fact, this article probably coined the term.
Over the ensuing decades, many corporations created so-called “IT departments” to manage the computer technologies related to their business. Whatever these departments worked on became the de facto definition of Information Technology, one that has evolved over time. Today, IT departments have responsibility in areas like
The future is now, or at least it is coming soon. Today's technological developments are looking very much like what once was the domain of science fiction. Maybe we don't have domed cities and flying cars, but we do have buildings that reach to the heavens, and drones that soon could deliver our packages. Who needs a flying car when the self-driving car -- though still on the ground -- is just down the road?
The media often compares technological advances to science fiction, and the go-to examples cited are often Star Trek, The Jetsons and various 1980s and 90s cyberpunk novels and similar dark fiction. In many cases, this is because many tech advances map fairly easily onto what those works of fiction presented.
On the other hand, they tend to be really lazy comparisons. Every advance in holographic technology should not immediately evoke Star Trek's holodeck, and every servant-styled robot should not immediately be compared to Rosie, the maid-robot in The Jetsons.
Read Article: http://www.technewsworld.com/story/84479.html
About Oliver Briscoe
Oliver Briscoe is a 20+ year veteran of the Information Technology field. He understands his first principles and loves teaching others.