
kra17
Veteran

Joined: 14 Feb 2010
Age: 30
Gender: Male
Posts: 594
Location: Sweden

01 Oct 2012, 9:56 pm

That does indeed sound pretty messy.
So not all providers broadcast the same channels in the same resolution?
Is there any reason why you would upscale a 720p source to 1080i? If some, but not all, of the shows were in 1080i I could understand it. But if the source is ALWAYS 720p? Is it supposed to look better than if the TVs scaled it themselves?


_________________
:bigsmurf: :bigsmurf:


sliqua-jcooter
Veteran

Joined: 25 Jan 2010
Age: 36
Gender: Male
Posts: 1,488
Location: Burke, Virginia, USA

01 Oct 2012, 9:58 pm

kra17 wrote:
That does indeed sound pretty messy.
So not all providers broadcast the same channels in the same resolution?
Is there any reason why you would upscale a 720p source to 1080i? If some, but not all, of the shows were in 1080i I could understand it. But if the source is ALWAYS 720p? Is it supposed to look better than if the TVs scaled it themselves?


Because if someone goes out and buys a 1080p TV, turns it on, and sees "720p" in the corner of their screen, they flip out.


_________________
Nothing posted here should be construed as the opinion or position of my company, or an official position of WrongPlanet in any way, unless specifically mentioned.


matt
Veteran

Joined: 20 Dec 2007
Gender: Male
Posts: 916

08 Oct 2012, 4:45 pm

sliqua-jcooter wrote:
NYC is exactly the worst place to do that. Running fiber in a dense urban area is *extremely* difficult. There is not enough backbone bandwidth to make anything work. Existing coax has the physical bandwidth to transmit streams - that's not the issue. IP has overhead - it needs to be switched and routed to an address - and that complexity adds an extreme amount of cost. Multicast streaming TV can be done - Verizon is doing it right now - but it's not going to ever work the way you want it to.

There will *never* be enough bandwidth wirelessly to stream HD video from "the cloud". It isn't mathematically possible. Either someone needs to develop a new compression codec that takes it down to 1/4 the size it is now, or we need to figure out a way to transmit signals higher than 10ghz - anything short of that isn't going to happen.
With HEVC a 1080p stream would be a little more than 5Mbps.

A majority of internet users in the U.S. are absolutely capable of that.

An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.
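For what it's worth, the back-of-envelope math behind those numbers looks like this (the ~5Mbps-per-stream figure is my assumption for HEVC 1080p, not a measured number):

Code:
# Rough aggregate-bandwidth estimate for the 100-apartment example.
# Assumed figures: ~5 Mbps per 1080p HEVC stream, 2 streams per apartment,
# plus ~20% headroom for protocol overhead.
STREAM_MBPS = 5
APARTMENTS = 100
STREAMS_PER_APARTMENT = 2
OVERHEAD = 0.20

raw_mbps = STREAM_MBPS * APARTMENTS * STREAMS_PER_APARTMENT
with_overhead = raw_mbps * (1 + OVERHEAD)

print(f"raw video:     {raw_mbps} Mbps")           # 1000 Mbps, i.e. ~1 Gbps
print(f"with overhead: {with_overhead:.0f} Mbps")  # ~1200 Mbps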

Using BitTorrent-type technology in cities like New York, where content comes in from multiple sources, could turn the density of cities like New York into a benefit instead of a drawback. Instead of using one-to-one communication, as most people here would assume, or one-to-many, as you're thinking, it would make a whole lot more sense to use many-to-many technology, so that once one device downloads the content it immediately starts sharing it with multiple other devices. Rather than streaming everything from one location, you minimize the distance that traffic has to travel, and you utilize upload capacity in addition to download bandwidth. There is a lot of unused upload capacity.
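To make the many-to-many idea concrete, here's a toy sketch (not any real protocol) of how one upstream fetch per chunk could fan out across every box on a node:

Code:
# Toy model of node-local, peer-assisted distribution.
# Assumption: every set-top box on the node wants the same program;
# each chunk is fetched from the head-end once and then shared locally.
CHUNK_MB = 4
CHUNKS = 50          # ~200 MB of video
BOXES_ON_NODE = 30

headend_mb = CHUNKS * CHUNK_MB                     # first copy comes from upstream
peer_mb = CHUNKS * CHUNK_MB * (BOXES_ON_NODE - 1)  # the rest moves box-to-box

print(f"upstream traffic:   {headend_mb} MB")
print(f"node-local traffic: {peer_mb} MB")
print(f"upstream saved vs. pure unicast: {1 - headend_mb / (headend_mb + peer_mb):.0%}")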

There will absolutely be enough bandwidth available to do widespread wireless HD streaming in the future, but the reasons for that are very complicated and I'm not going to get into them unless you decide to press further about why.



sliqua-jcooter
Veteran

Joined: 25 Jan 2010
Age: 36
Gender: Male
Posts: 1,488
Location: Burke, Virginia, USA

08 Oct 2012, 4:55 pm

matt wrote:
With HEVC a 1080p stream would be a little more than 5Mbps.

A majority of internet users in the U.S. are absolutely capable of that.

An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.


Most apartment buildings get fed with 100mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
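To put those feed sizes next to the ~5Mbps-per-stream figure from above (both numbers are rough):

Code:
# How many ~5 Mbps HD streams fit down a typical building feed?
STREAM_MBPS = 5
feeds = {"100 Mbps Ethernet": 100, "OC-3 (~155 Mbps)": 155, "1 Gbps": 1000}

for name, capacity_mbps in feeds.items():
    print(f"{name}: ~{capacity_mbps // STREAM_MBPS} simultaneous streams")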

Quote:
Using BitTorrent-type technology in cities like New York, where content comes in from multiple sources, could turn the density of cities like New York into a benefit instead of a drawback. Instead of using one-to-one communication, as most people here would assume, or one-to-many, as you're thinking, it would make a whole lot more sense to use many-to-many technology, so that once one device downloads the content it immediately starts sharing it with multiple other devices. Rather than streaming everything from one location, you minimize the distance that traffic has to travel, and you utilize upload capacity in addition to download bandwidth. There is a lot of unused upload capacity.


BitTorrent doesn't help the problem; it makes it dramatically worse. BitTorrent was specifically designed to circumvent the TCP protections that let network administrators control traffic inside their network. It gets you faster transfer speeds, but it does so at the expense of the network as a whole. Traffic engineers hate BitTorrent for this reason.

Quote:
There will absolutely be enough bandwidth available to do widespread wireless HD streaming in the future, but the reasons for that are very complicated and I'm not going to get into them unless you decide to press further about why.


With advancements in encoding, and the FCC's huge narrowbanding push - maybe. But it's not possible right now, and it won't be until dramatic changes are made to the mobile networks. LTE is literally the first step along that path, and it's just now starting to get a foothold - so I wouldn't hold my breath on this one.


_________________
Nothing posted here should be construed as the opinion or position of my company, or an official position of WrongPlanet in any way, unless specifically mentioned.


matt
Veteran

Joined: 20 Dec 2007
Gender: Male
Posts: 916

08 Oct 2012, 6:25 pm

sliqua-jcooter wrote:
matt wrote:
With HEVC a 1080p stream would be a little more than 5Mbps.

A majority of internet users in the U.S. are absolutely capable of that.

An apartment building that had 100 apartments would need ~1Gbps + overhead in order for each apartment to have two HD streams. That's completely doable.


Most apartment buildings get fed with 100mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
Of course. That's the cost efficiency of packet switching. So you design around it. Take into account things like the fact that, in a building with 200 different televisions all turned on, the chances that every one of them is watching something different are almost zero. Many will be watching the same things, and many will be watching shows that were previously recorded.

sliqua-jcooter wrote:
Quote:
Using BitTorrent-type technology in cities like New York, where content comes in from multiple sources, could turn the density of cities like New York into a benefit instead of a drawback. Instead of using one-to-one communication, as most people here would assume, or one-to-many, as you're thinking, it would make a whole lot more sense to use many-to-many technology, so that once one device downloads the content it immediately starts sharing it with multiple other devices. Rather than streaming everything from one location, you minimize the distance that traffic has to travel, and you utilize upload capacity in addition to download bandwidth. There is a lot of unused upload capacity.


BitTorrent doesn't help the problem; it makes it dramatically worse. BitTorrent was specifically designed to circumvent the TCP protections that let network administrators control traffic inside their network. It gets you faster transfer speeds, but it does so at the expense of the network as a whole. Traffic engineers hate BitTorrent for this reason.
I didn't say BitTorrent specifically.

I said BitTorrent-type technologies. One thing engineers have been very good at is designing technology so that weaknesses become strengths. Imagine a system in which a unicast stream goes to one device on a node and that device immediately begins notifying other devices of the availability of the content. The likelihood that every person in a 100-apartment building would be watching a different channel simultaneously is infinitesimal, so you can immediately throw out the notion that you'd require 200 unicast streams.

First, using multicasting makes sense, but you've already mentioned that, and that's already being used. So consider what else we'd need to support widespread wired, on-demand IP HDTV.

Second, within homes, make all DVR devices network-enabled. Give every box a GigE or a 400Gbps Ethernet port, and use 60GHz wireless (like 802.11ad), which won't easily penetrate walls but which is already being used in testing for new wireless HDTV devices. This would be a huge benefit over 2.4GHz and 5GHz devices, because instead of competing with your neighbors for bandwidth, your wireless networks would be mostly isolated.

Third, have all DVR devices on the same network nodes communicate with each other to share data. Use upload capacity from DVR devices on the same nodes to reduce bandwidth costs. If desired, there are a million ways to prevent the traffic from ever going upstream beyond the local node.

Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.

Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.
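A minimal sketch of the third through fifth points, with everything here (names, the cache policy, the pre-seeding step) made up purely for illustration:

Code:
# Hypothetical node-local catalog: which boxes already hold which show.
# A box asks its neighbors on the node first and only falls back upstream.
from collections import defaultdict

node_catalog = defaultdict(set)   # content_id -> set of box ids that cache it

def fetch(content_id, box_id):
    peers = node_catalog[content_id] - {box_id}
    source = f"peer box {min(peers)}" if peers else "head-end (upstream)"
    node_catalog[content_id].add(box_id)   # this box now caches a copy too
    return source

# Overnight pre-seeding (points four and five): push tomorrow's popular show
# to one idle box, then everyone else pulls it locally the next evening.
print(fetch("ep-0451", box_id=7))    # head-end (upstream)
print(fetch("ep-0451", box_id=12))   # peer box 7
print(fetch("ep-0451", box_id=3))    # peer box 7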



sliqua-jcooter
Veteran

Joined: 25 Jan 2010
Age: 36
Gender: Male
Posts: 1,488
Location: Burke, Virginia, USA

08 Oct 2012, 7:21 pm

matt wrote:
sliqua-jcooter wrote:
Most apartment buildings get fed with 100mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
Of course. That's the cost efficiency of packet switching. So you design around it. Take into account things like the fact that, in a building with 200 different televisions all turned on, the chances that every one of them is watching something different are almost zero. Many will be watching the same things, and many will be watching shows that were previously recorded.


Yes - but the likelihood that 200 televisions will be watching more than 10 different channels, when 800 channels are being offered, is quite high - especially during peak hours. And that's assuming that you have 100% of available bandwidth allocated to TV. There's another issue - live TV services don't oversubscribe as effectively as Internet access does. If you're downloading a file at the same time 10 other people are downloading a file in your apartment building, the available bandwidth gets distributed more-or-less evenly between them, and as a result everyone gets speeds in the range of 10-15mbps. It's not the full rate that they pay for, but generally everyone is happy because they get their file in a reasonable amount of time. You can't do that with a live streaming service - once the allocated/available bandwidth has been used, no more streams can be added. Try explaining that to someone who, all of a sudden, can't watch TV because too many other people are.

You would have to design the network to provide not just a reasonable level of service, but a guaranteed level of service - even during peak times, which means over-sizing links to handle a peak usage somewhere around 70-80%, instead of being able to oversubscribe the link to a contention ratio of ~10:1, which is common. The costs associated with this added infrastructure would drive the cost up to be several times that of existing cable service.
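Rough numbers for that sizing difference, using made-up but plausible figures (100 subscribers, 25Mbps data plans, two guaranteed ~5Mbps HD streams per household):

Code:
# Best-effort Internet sizing vs. guaranteed live-TV sizing for one access link.
SUBSCRIBERS = 100
PLAN_MBPS = 25          # what each customer nominally buys
STREAMS_EACH = 2        # guaranteed HD streams per household
STREAM_MBPS = 5
CONTENTION = 10         # ~10:1 oversubscription, common for best-effort data
PEAK_UTIL = 0.75        # size guaranteed links to run ~70-80% hot at peak

best_effort_link_mbps = SUBSCRIBERS * PLAN_MBPS / CONTENTION
guaranteed_link_mbps = SUBSCRIBERS * STREAMS_EACH * STREAM_MBPS / PEAK_UTIL

print(f"best-effort data link: {best_effort_link_mbps:.0f} Mbps")  # 250 Mbps
print(f"guaranteed TV link:    {guaranteed_link_mbps:.0f} Mbps")   # ~1333 Mbps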

Quote:
I said BitTorrent-type technologies. One thing engineers have been very good at is designing technology so that weaknesses become strengths. Imagine a system in which a unicast stream goes to one device on a node and that device immediately begins notifying other devices of the availability of the content. The likelihood that every person in a 100-apartment building would be watching a different channel simultaneously is infinitesimal, so you can immediately throw out the notion that you'd require 200 unicast streams.


No - you'd probably need 20-30 multicast streams, but that's still more bandwidth than is available. And P2P doesn't do anything to help that.

Quote:
Second, within homes, make all DVR devices network-enabled. Give every box a GigE or a 400Gbps Ethernet port, and use 60GHz wireless (like 802.11ad), which won't easily penetrate walls but which is already being used in testing for new wireless HDTV devices. This would be a huge benefit over 2.4GHz and 5GHz devices, because instead of competing with your neighbors for bandwidth, your wireless networks would be mostly isolated.


And the benefit is what, exactly? Other than being able to stream recorded content to other STBs/TVs (which is already possible with MoCA)

Quote:
Third, have all DVR devices on the same network nodes communicate with each other to share data. Use upload capacity from DVR devices on the same nodes to reduce bandwidth costs. If desired, there are a million ways to prevent the traffic from ever going upstream beyond the local node.

Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.

Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.


Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.


_________________
Nothing posted here should be construed as the opinion or position of my company, or an official position of WrongPlanet in any way, unless specifically mentioned.


matt
Veteran

Joined: 20 Dec 2007
Gender: Male
Posts: 916

08 Oct 2012, 8:31 pm

sliqua-jcooter wrote:
matt wrote:
sliqua-jcooter wrote:
Most apartment buildings get fed with 100mbps, or an OC-3 - depending on the network architecture. FiOS and brand-new deployments can be fed with a gig link, but the bottleneck just gets pushed further up the chain. There's *always* oversubscription somewhere.
Of course. That's the cost efficiency of packet switching. So you design around it. Take into account things like the fact that, in a building with 200 different televisions all turned on, the chances that every one of them is watching something different are almost zero. Many will be watching the same things, and many will be watching shows that were previously recorded.


Yes - but the likelihood that 200 televisions will be watching more than 10 different channels, when 800 channels are being offered, is quite high - especially during peak hours. And that's assuming that you have 100% of available bandwidth allocated to TV. There's another issue - live TV services don't oversubscribe as effectively as Internet access does. If you're downloading a file at the same time 10 other people are downloading a file in your apartment building, the available bandwidth gets distributed more-or-less evenly between them, and as a result everyone gets speeds in the range of 10-15mbps. It's not the full rate that they pay for, but generally everyone is happy because they get their file in a reasonable amount of time. You can't do that with a live streaming service - once the allocated/available bandwidth has been used, no more streams can be added. Try explaining that to someone who, all of a sudden, can't watch TV because too many other people are.

You would have to design the network to provide not just a reasonable level of service, but a guaranteed level of service - even during peak times, which means over-sizing links to handle a peak usage somewhere around 70-80%, instead of being able to oversubscribe the link to a contention ratio of ~10:1, which is common. The costs associated with this added infrastructure would drive the cost up to be several times that of existing cable service.
Why would you insist on live television services? What percentage of television is actually live-to-air? Even when there is a major event, the vast majority of television is prerecorded. On the heaviest news days there might be 30 different live television news feeds, but on the days that major news happens most people tune to the biggest ones.

sliqua-jcooter wrote:
Quote:
Second, within homes, make all DVR devices network-enabled. Give every box a GigE or a 400Gbps Ethernet port, and use 60GHz wireless (like 802.11ad), which won't easily penetrate walls but which is already being used in testing for new wireless HDTV devices. This would be a huge benefit over 2.4GHz and 5GHz devices, because instead of competing with your neighbors for bandwidth, your wireless networks would be mostly isolated.


And the benefit is what, exactly? Other than being able to stream recorded content to other STBs/TVs (which is already possible with MoCA)
Being able to stream recorded content to other televisions is the benefit.

Being able to use unused bandwidth for other services would be a major benefit.

sliqua-jcooter wrote:
Quote:
Third, have all DVR devices on the same network nodes communicate with each other to share data. Use upload capacity from DVR devices on the same nodes to reduce bandwidth costs. If desired, there are a million ways to prevent the traffic from ever going upstream beyond the local node.

Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.

Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.


Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.
It does no such thing.

Nothing I mentioned would have any effect on detecting who was watching, when or where. Network-attached DVRs could report statistics as well as any other device can. They could be accessible over the network by the support people at the ISP.
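The kind of report such a DVR could phone home with is nothing exotic - a few fields per viewing event would do (everything below is hypothetical, obviously):

Code:
# Hypothetical viewing-event report from a network-attached DVR.
import json
import time

report = {
    "device_id": "dvr-000123",     # made-up identifier
    "content_id": "ep-0451",
    "started_at": int(time.time()),
    "watched_seconds": 2520,
    "source": "node-local peer",   # vs. "head-end" or "local recording"
}
print(json.dumps(report))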

As for security and privacy issues, what security and privacy issues do you believe there would be?



sliqua-jcooter
Veteran

Joined: 25 Jan 2010
Age: 36
Gender: Male
Posts: 1,488
Location: Burke, Virginia, USA

09 Oct 2012, 5:46 am

matt wrote:
Why would you insist on live television services? What percentage of television is actually live-to-air? Even when there is a major event, the vast majority of television is prerecorded. On the heaviest news days there might be 30 different live television news feeds, but on the days that major news happens most people tune to the biggest ones.


Because that's how TV is delivered, and broadcast networks have a vested interest in keeping it that way. If everything switched to a VoD model, the problem effectively goes away by itself and this whole conversation is moot.

Quote:
sliqua-jcooter wrote:
And the benefit is what, exactly? Other than being able to stream recorded content to other STBs/TVs (which is already possible with MoCA)


Being able to stream recorded content to other televisions is the benefit.

Being able to use unused bandwidth for other services would be a major benefit.


Like I said, this is already being done:
http://www.directv.com/DTVAPP/content/t ... /wholehome
http://ww2.cox.com/residential/northern ... ackage.cox
http://www.comcast.com/anyroomdvr/?SCRedirect=true

sliqua-jcooter wrote:
Quote:
Third, have all DVR devices on the same network nodes communicate with each other to share data. Use upload capacity from DVR devices on the same nodes to reduce bandwidth costs. If desired, there are a million ways to prevent the traffic from ever going upstream beyond the local node.

Fourth, make all DVR devices download and cache content when they're not being used. I'd expect that over 90% of televisions have at least eight hours per day when they are not being used, and most probably go the vast majority of the day without being used. With storage being really cheap, they could easily download, store and serve to neighbors. And the vast majority of television is not live, so if you want to free up bandwidth, it's possible to release content to the DVRs in advance of the time that it's made available.

Fifth, if the DVRs on your local node keep track of what's been downloaded in the past, they can make assumptions about what DVRs on that node would want to download in the future. Releasing video at staggered intervals throughout the day, and making use of the times when bandwidth demand is at the lowest, would help alleviate network congestion.


Obvious security and privacy issues aside, content producers will never allow this because it destroys any mechanism they have for tracking how many people watch a show - which is what directly determines advertising interest, and over the long term enables them to stay in business. The only reason content providers are on board for Internet-based technology is they've realized that the bi-directional nature of the technology lets them dig deeper into demographics and psychographics.
It does no such thing.

Nothing I mentioned would have any effect on detecting who was watching, when or where. Network-attached DVRs could report statistics as well as any other device can. They could be accessible over the network by the support people at the ISP.

As for security and privacy issues, what security and privacy issues do you believe there would be?

Except DVRs that phone home have been the subject of major privacy scrutiny by various groups. Those are the privacy concerns - and the security concern is a device behind the user's network that talks to other devices behind other users' networks. It very efficiently bypasses network security devices at the border (which is the only place home users have security) and would be an extremely valuable attack vector.


_________________
Nothing posted here should be construed as the opinion or position of my company, or an official position of WrongPlanet in any way, unless specifically mentioned.


MyFutureSelfnMe
Veteran

Joined: 26 Feb 2010
Age: 44
Gender: Male
Posts: 1,385

09 Oct 2012, 10:00 pm

Only a fraction of TV is actually live. If broadcasters wanted to, they could deliver encrypted content in advance and send the keys in real time as the content is intended to be viewed. But most people I know, myself included, skip broadcasts entirely and get shows a day later via DVR or BitTorrent. I am a busy individual, and if I'm going to watch something at a date and time determined by someone else, it had better be a Shakespearean masterpiece.
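The pre-delivery idea is simple in principle: push the ciphertext out ahead of time and only release the key at air time. A minimal sketch of that, using Python's cryptography package as a stand-in for whatever scheme a broadcaster would actually use:

Code:
# Pre-position encrypted content, then release only the key at broadcast time.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # broadcaster holds this back until air time
cipher = Fernet(key)

# Days ahead, during off-peak hours: ship the encrypted show to every DVR.
encrypted_show = cipher.encrypt(b"...video payload...")

# At air time: send only the (tiny) key; every cached copy unlocks at once.
playable = Fernet(key).decrypt(encrypted_show)
assert playable == b"...video payload..."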



MyFutureSelfnMe
Veteran

Joined: 26 Feb 2010
Age: 44
Gender: Male
Posts: 1,385

09 Oct 2012, 10:31 pm

That said, the live nature of TV should actually make it easier to move the data both wirelessly and on cable networks - cable obviously already has the bandwidth; it's just moving the data in QAM format as opposed to a packet format. Multicasting wirelessly is obviously possible, since OTA stations do it, in some cases at remarkably high bit rates. We just need new hardware and new protocols. In the meantime, I'll keep torrenting shows that, as a DirecTV subscriber, I'm already paying for anyway. It only takes a few minutes to download a 1GB, 40-minute show.
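The arithmetic on that checks out - assuming "a few minutes" means about five:

Code:
# Sustained rate needed to pull a 1 GB show in five minutes.
size_gb = 1
minutes = 5
mbps = size_gb * 8000 / (minutes * 60)   # decimal GB -> megabits
print(f"~{mbps:.0f} Mbps")               # ~27 Mbps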

DirecTV probably broadcasts fewer than 100 channels of actual network television that anyone cares about. During prime time, probably 80% of users are watching the same 5 or so channels. I tried an OTA antenna about a year ago on the Upper West Side of Manhattan and got 72 channels.