Best TV-on-PC device?
That does indeed sound pretty messy.
So not all providers broadcast the same channels in the same resolution?
Is there any reason why you would upscale a 720p source to 1080i? If some, but not all, of the shows were in 1080i I could understand. But if the source is ALWAYS 720p? Is it supposed to look better than if the TV itself did the scaling?
sliqua-jcooter
Veteran
Joined: 25 Jan 2010
Age: 36
Gender: Male
Posts: 1,488
Location: Burke, Virginia, USA
So not all providers broadcast the same channels in the same resolution?
Is there any reason why you would upscale a 720p source to 1080i? If some, but not all, of the shows were in 1080i I could understand. But if the source is ALWAYS 720p? Is it supposed to look better than if the TV itself did the scaling?
Because if someone goes out and buys a 1080p TV, turns it on, and sees "720p" in the corner of the screen, they flip out.
_________________
Nothing posted here should be construed as the opinion or position of my company, or an official position of WrongPlanet in any way, unless specifically mentioned.
There will *never* be enough bandwidth wirelessly to stream HD video from "the cloud". It isn't mathematically possible. Either someone needs to develop a new compression codec that takes it down to 1/4 the size it is now, or we need to figure out a way to transmit signals higher than 10 GHz - anything short of that isn't going to happen.
A majority of internet users in the U.S. are absolutely capable of that.
An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.
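That ~1 Gbps figure works out if you assume roughly 5 Mbps per HD stream - an assumption on my part, since real HD bitrates vary by codec and quality:

```python
# Back-of-the-envelope check of the "~1 Gbps for 100 apartments" figure.
# The 5 Mbps per-stream rate is an assumption; actual HD bitrates range
# from roughly 3 to 8 Mbps depending on codec and quality settings.

APARTMENTS = 100
STREAMS_PER_APARTMENT = 2
MBPS_PER_HD_STREAM = 5  # assumed

total_mbps = APARTMENTS * STREAMS_PER_APARTMENT * MBPS_PER_HD_STREAM
print(f"Aggregate demand: {total_mbps} Mbps (~{total_mbps / 1000:.1f} Gbps)")
```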
Using BitTorrent-type technology in cities like New York, where content comes in from multiple sources, could make the density of a place like New York a benefit instead of a drawback. Instead of using one-to-one communication, as most people here would assume, and instead of using one-to-many, as you're suggesting, it would make far more sense to use many-to-many technology, so that once one device has downloaded the information it immediately starts sharing it with multiple devices. Rather than streaming everything from one location, you minimize the distance that traffic has to travel, and you use upload capacity in addition to download bandwidth. There is a lot of unused upload capacity.
There will absolutely be enough bandwidth available to do widespread wireless HD streaming in the future, but the reasons for that are very complicated and I'm not going to get into them unless you decide to press further about why.
sliqua-jcooter
A majority of internet users in the U.S. are absolutely capable of that.
An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.
Most apartment buildings get fed with 100 Mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
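To put numbers on that: an OC-3 carries roughly 155 Mbps, so against the demand figure from earlier (again assuming ~5 Mbps per HD stream, which is my assumption, not a measured rate), the feed is oversubscribed several times over:

```python
# Contention ratio for a 100-apartment building fed by an OC-3.
# The 5 Mbps per-stream figure is assumed; OC-3 line rate is ~155 Mbps.

OC3_MBPS = 155
peak_demand_mbps = 100 * 2 * 5  # 100 apartments, 2 HD streams each

contention = peak_demand_mbps / OC3_MBPS
print(f"Demand {peak_demand_mbps} Mbps vs {OC3_MBPS} Mbps feed "
      f"-> ~{contention:.1f}:1 oversubscription")
```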
BitTorrent doesn't help the problem, it makes it dramatically worse. BitTorrent was specifically designed to circumvent the TCP protections that let network administrators control traffic inside their network. It gets you faster transfer speeds, but does so at the expense of the network as a whole. Traffic engineers hate BitTorrent for this reason.
With advancements in encoding, and the FCC's huge narrowbanding push - maybe. But it's not possible right now, and it won't be until dramatic changes are made to the mobile networks. LTE is literally the first step along that path, and it's just now starting to get a foothold - so I wouldn't hold my breath on this one.
A majority of internet users in the U.S. are absolutely capable of that.
An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.
Most apartment buildings get fed with 100 Mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
BitTorrent doesn't help the problem, it makes it dramatically worse. BitTorrent was specifically designed to circumvent the TCP protections that let network administrators control traffic inside their network. It gets you faster transfer speeds, but does so at the expense of the network as a whole. Traffic engineers hate BitTorrent for this reason.
I said Bittorrent-type technologies. One thing that engineers have been very good at is to design technology so that weaknesses become strengths. Imagine a system in which a unicast stream is streamed to one device on a node and that device immediately begins notifying other devices of the availability of the content. The likelihood that every person in a 100-apartment building would be watching a different channel simultaneously is infinitesimal, so you can immediately throw out the notion that you'd require 200 unicast streams.
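The idea can be sketched as a toy model: the first device to pull a chunk from the headend announces it to a node-local registry, and later requests for the same chunk are served by a peer on the node instead of going upstream. The names here (NodeRegistry, fetch) are illustrative, not a real protocol:

```python
# Toy model of node-local "many-to-many" distribution: the first device
# to receive a chunk from the headend announces it; later requests for
# the same chunk are served by a peer on the same node.

class NodeRegistry:
    def __init__(self):
        self.holders = {}          # chunk id -> list of devices holding it
        self.headend_fetches = 0   # upstream (headend) traffic counter

    def fetch(self, device, chunk):
        peers = self.holders.setdefault(chunk, [])
        if peers:
            source = peers[0]      # served from a peer on the same node
        else:
            source = "headend"     # first copy comes from upstream
            self.headend_fetches += 1
        peers.append(device)
        return source

node = NodeRegistry()
sources = [node.fetch(f"dvr{i}", "ep01-chunk0") for i in range(5)]
print(sources)               # only the first fetch hits the headend
print(node.headend_fetches)  # upstream fetched exactly once
```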
First, multicasting makes sense, but you've already mentioned that, and it's already being used. So consider what else we'd need to support widespread wired, on-demand IP HDTV.
Second, within homes, make all DVR devices network-enabled. Give every box a GigE (or faster) Ethernet port, and use 60 GHz wireless (like 802.11ad), which won't easily penetrate walls but which is already being used in testing for new wireless HDTV devices. This has a huge benefit over 2.4 GHz and 5 GHz devices: instead of competing with your neighbors for bandwidth, your wireless network would be mostly isolated.
Third, have all DVR devices on the same network nodes communicate with each other to share data. Use upload capacity from DVR devices on the same nodes to reduce bandwidth costs. If desired, there are a million ways to prevent the traffic from ever going upstream beyond the local node.
Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.
Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.
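The fifth point can be sketched with a few lines of code: tally the node's download history and push the most popular series to DVR caches during off-peak hours. The data and names here are made up for illustration:

```python
# Sketch of predictive prefetching: use a node's download history to
# decide what to push to DVR caches off-peak. Data is illustrative.

from collections import Counter

history = [  # (device, series) pairs logged on this node; made-up data
    ("dvr1", "news"), ("dvr2", "news"), ("dvr3", "drama"),
    ("dvr1", "news"), ("dvr2", "drama"), ("dvr4", "news"),
]

def prefetch_candidates(history, top_k=2):
    """Return the top_k most-downloaded series on this node."""
    counts = Counter(series for _, series in history)
    return [series for series, _ in counts.most_common(top_k)]

print(prefetch_candidates(history))  # most popular series first
```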
sliqua-jcooter
Yes - but the likelihood that 200 televisions will be watching more than 10 different channels, when 800 channels are being offered, is quite high - especially during peak hours. And that's assuming that you have 100% of available bandwidth allocated to TV. There's another issue - live TV services don't oversubscribe as effectively as Internet access does. If you're downloading a file at the same time 10 other people are downloading a file in your apartment building, the available bandwidth gets distributed more-or-less evenly between them, and as a result everyone gets speeds in the range of 10-15 Mbps. It's not the full rate that they pay for, but generally everyone is happy because they get their file in a reasonable amount of time. You can't do that with a live streaming service - once the allocated/available bandwidth has been used, no more streams can be added. Try explaining that to someone who, all of a sudden, can't watch TV because too many other people are.
You would have to design the network to provide not just a reasonable level of service, but a guaranteed level of service - even during peak times, which means over-sizing links to handle a peak usage somewhere around 70-80%, instead of being able to oversubscribe the link to a contention ratio of ~10:1, which is common. The costs associated with this added infrastructure would drive the cost up to be several times that of existing cable service.
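The difference between elastic downloads and fixed-rate streams can be shown with made-up numbers in line with the post (a ~150 Mbps shared link, ~5 Mbps per stream - both assumptions):

```python
# Why elastic downloads oversubscribe gracefully but live streams don't.
# Link capacity and per-stream rate are assumed, illustrative values.

LINK_MBPS = 150

def download_share(active_downloads):
    """TCP-style fair sharing: everyone slows down, nobody is refused."""
    return LINK_MBPS / active_downloads

def streams_admitted(stream_mbps=5):
    """Live streams need a fixed rate: past this count, admission fails."""
    return LINK_MBPS // stream_mbps

print(f"10 downloads -> {download_share(10):.0f} Mbps each")
print(f"Max concurrent 5 Mbps streams: {streams_admitted()}")
```

With downloads, an eleventh user just slows everyone down a little; with live streams, the thirty-first user gets nothing at all.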
No - you'd probably need 20-30 multicast streams, but that's still more bandwidth than is available. And P2P doesn't do anything to help that.
And the benefit is what, exactly? Other than being able to stream recorded content to other STBs/TVs (which is already possible with MoCA)?
Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.
Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.
Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.
Yes - but the likelihood that 200 televisions will be watching more than 10 different channels, when 800 channels are being offered, is quite high - especially during peak hours. And that's assuming that you have 100% of available bandwidth allocated to TV. There's another issue - live TV services don't oversubscribe as effectively as Internet access does. If you're downloading a file at the same time 10 other people are downloading a file in your apartment building, the available bandwidth gets distributed more-or-less evenly between them, and as a result everyone gets speeds in the range of 10-15 Mbps. It's not the full rate that they pay for, but generally everyone is happy because they get their file in a reasonable amount of time. You can't do that with a live streaming service - once the allocated/available bandwidth has been used, no more streams can be added. Try explaining that to someone who, all of a sudden, can't watch TV because too many other people are.
You would have to design the network to provide not just a reasonable level of service, but a guaranteed level of service - even during peak times, which means over-sizing links to handle a peak usage somewhere around 70-80%, instead of being able to oversubscribe the link to a contention ratio of ~10:1, which is common. The costs associated with this added infrastructure would drive the cost up to be several times that of existing cable service.
And the benefit is what, exactly? Other than being able to stream recorded content to other STBs/TVs (which is already possible with MoCA)?
Being able to use unused bandwidth for other services would be a major benefit.
Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.
Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.
Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.
Nothing I mentioned would have any effect on detecting who was watching, when or where. Network-attached DVRs could report statistics as well as any other device can. They could be accessible over the network by the support people at the ISP.
As for security and privacy issues, what security and privacy issues do you believe that there would be?
sliqua-jcooter
Because that's how TV is delivered, and broadcast networks have a vested interest in keeping it that way. If everything switched to a VoD model, the problem effectively goes away by itself and this whole conversation is moot.
Being able to stream recorded content to other televisions is the benefit.
Being able to use unused bandwidth for other services would be a major benefit.
Like I said, this is already being done. http://www.directv.com/DTVAPP/content/t ... /wholehome http://ww2.cox.com/residential/northern ... ackage.cox http://www.comcast.com/anyroomdvr/?SCRedirect=true
Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.
Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.
Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.
Nothing I mentioned would have any effect on detecting who was watching, when or where. Network-attached DVRs could report statistics as well as any other device can. They could be accessible over the network by the support people at the ISP.
As for security and privacy issues, what security and privacy issues do you believe that there would be?
Except DVRs that phone home have been the subject of major privacy scrutiny by various groups. Those are the privacy concerns - and the security concern is a device inside the user's network that talks to devices inside other users' networks. It very efficiently bypasses network security devices at the border (which is the only place home users have security) and would be an extremely valuable attack vector.
Only a fraction of TV is actually live. If broadcasters wanted to, they could deliver encrypted content in advance and send keys in real time as the content is meant to be viewed. But most people I know, myself included, skip broadcasts entirely and get shows a day later via DVR or BitTorrent. I am a busy individual, and if I'm going to watch something at a date and time determined by someone else, it had better be a Shakespearean masterpiece.
The live nature of TV should actually make it easier to move the data, both wirelessly and on cable networks - cable obviously already has the bandwidth, it's just moving the data in QAM format rather than a packet format; multicasting wirelessly is obviously possible, since OTA stations do it, in some cases at remarkably high bit rates. We just need new hardware and new protocols. In the meantime, I'll keep BitTorrenting shows that, as a DirecTV subscriber, I'm already paying for anyway. It only takes a few minutes to download a 1 GB, 40-minute show.
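The arithmetic behind "a few minutes" is straightforward - here's a quick sketch, assuming a 50 Mbps connection (the connection speed is my assumption, not a figure from the post):

```python
# Rough numbers behind downloading a 1 GB, 40-minute show:
# the file's average bitrate, and the download time at an assumed
# 50 Mbps connection.

FILE_GB = 1
MINUTES = 40

file_megabits = FILE_GB * 8000                 # 1 GB ~= 8000 megabits
avg_bitrate = file_megabits / (MINUTES * 60)   # Mbps of the encoded file
download_minutes = file_megabits / 50 / 60     # at an assumed 50 Mbps

print(f"Average bitrate: {avg_bitrate:.1f} Mbps")
print(f"Download time at 50 Mbps: {download_minutes:.1f} min")
```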
DirecTV probably broadcasts fewer than 100 channels of network television that anyone actually cares about. During prime time, probably 80% of viewers are watching the same 5 or so channels. I tried an OTA antenna about a year ago on the Upper West Side of Manhattan and got 72 channels.