Over the past few months, I’ve been helping my mother downsize her house. In the process, I’ve brought about five TV sets to the town’s recycling station (a.k.a., the dump). One TV in particular had been around since before I was born. It weighed about 50 pounds and had wood paneling on the front, dials to change channels, and of course, bunny ears. It almost looked as though you could turn this TV on, and Leave it to Beaver would magically appear on the screen.
What I noticed when dropping off my mother’s TV sets was that other people were throwing away fairly new TV sets—mostly flat-screen TVs. Why? Smart TVs with built-in WiFi are replacing the “dumb” TVs that cannot be adapted to the newer technology.
If the TV toss-aways at the dump are any indication, Smart TVs have virtually eliminated the practice of holding onto TVs for 20-30 years. (Maybe that’s just my mother’s practice.) Why? Smart TVs require software upgrades, and now, products like Google’s new set-top box will inevitably require upgrades as well. This means spending more money on TV sets, more frequently.
There is evidence that the Smart TV market is growing fast. In fact, research done by Parks Associates indicates that Smart TV adoption increased by 31% in one year. Has it been a predictable change? Yes. The question is, what does this mean for desktop computers? Will they adapt, or will Smart TVs cut into their business until desktops are eventually phased out? Think about it: If you can check your email or Facebook page on your TV from your couch, using a wireless keyboard, and then switch over to watching TV, what would you use a desktop computer for? The situation is similar to what happened with video cameras. With so many digital cameras and smart phones that have video capabilities, there is little point in having a separate video camera. (Shh… don’t tell GoPro.)
What’s great about upgrading to a Smart TV? They’re easier to use, and you can still watch Leave it to Beaver—or simply stay connected to everything and everyone. No bunny ears needed.