
Access Networks: Applications and Policy



Presentation Transcript


  1. Access Networks: Applications and Policy. Nick Feamster, CS 6250, Fall 2011 (HomeOS slides from Ratul Mahajan)

  2. Huge amount of tech in homes

  3. Home users struggle • Management Nightmare • Integration Hurdles

  4. Why developers are not helping

  5. Vendors only build islands • Vertically integrate hardware and software • Seldom make use of other vendors’ devices • No single vendor comes close to providing all the devices a home needs

  6. Interoperability is not sufficient • Media: DLNA, AirTunes, etc. • Devices: UPnP, SpeakEasy, mDNS, etc. • Home automation: Z-Wave, ZigBee, X10, etc. Example applications that span these silos: video recording, climate control, camera-based entry, remote lock.

  7. Monolithic systems are inextensible • Security: ADT, Brinks, etc. • Academic: EasyLiving, House_n, etc. • Commercial: Control4, Elk M1, Leviton, etc. (Figure: separate home, media, and security silos.)

  8. An alternative approach: a home-wide operating system. (Figure: applications such as video recording, remote unlock, and climate control run atop the operating system and are distributed through a HomeStore.)

  9. Goals of HomeOS • Simplify application development • Enable innovation and device differentiation • Simplify user management

  10. Simplify development (Figure: applications App A, App B, etc. built atop a common platform.)

  11. Simplify development (Figure: applications reach device drivers through ports, mediated by access control and a management UI.)

  12. Roles in HomeOS • Roles are functional descriptions of ports • lightswitch, television, display, speakers, etc. • App developers program against roles • Enable vendors to innovate/differentiate • Anyone can create a new role • e.g., SonyBraviaTV vs. television • Allows new functionality to be rapidly exposed • Commodity vendors can still participate
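The role abstraction above can be sketched in code. HomeOS itself is written in C#; the Python below is an illustrative sketch only, and all class and role names (`Port`, `Kernel`, `SonyBraviaTV`) are hypothetical stand-ins for the idea of programming against roles rather than devices.

```python
# Hypothetical sketch of HomeOS-style roles: an app requests a role
# (e.g., "television"), not a specific device, so any vendor driver
# whose port exports that role is compatible.

class Port:
    """A driver-exported endpoint advertising one or more roles."""
    def __init__(self, name, roles):
        self.name = name
        self.roles = set(roles)

class Kernel:
    def __init__(self):
        self.ports = []

    def register(self, port):
        self.ports.append(port)

    def lookup(self, role):
        """Return all ports exporting the requested role."""
        return [p for p in self.ports if role in p.roles]

kernel = Kernel()
# A vendor port can export a generic role and a vendor-specific one.
kernel.register(Port("living-room-tv", {"television", "SonyBraviaTV"}))
kernel.register(Port("hall-switch", {"lightswitch"}))

# An app written against the generic "television" role still finds the
# Sony device; an app using "SonyBraviaTV" can reach vendor extras.
tvs = kernel.lookup("television")
```

This is how a new role (SonyBraviaTV) can expose differentiated functionality while commodity vendors still participate through the generic role.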

  13. Simplify user management • Conducted a field study • Modern homes with automation & other tech • 14 homes, 31 people • Users’ needs for access control • Applications as security principals • Time in access control decisions • Confidence in their configuration

  14. Management primitives • Datalog access control rules • (port, group, module, time-start, time-end, day, priority, access-mode) • Reliable reverse perspectives help users confidently configure access control • User accounts • Can be restricted by time (guests) • Application manifests • Specify role requirements for compatibility testing • Simplifies rule setup (only when roles match)
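The rule tuple above can be made concrete with a small sketch. This is not the actual HomeOS Datalog engine; the evaluation order (highest-priority matching rule wins, default deny) and all rule values are assumptions for illustration.

```python
# Illustrative sketch (not actual HomeOS code) of evaluating
# (port, group, module, time-start, time-end, day, priority, access-mode)
# rules: among the rules matching a request, the highest-priority
# rule decides; with no match, access is denied (assumed default).

from collections import namedtuple

Rule = namedtuple(
    "Rule", "port group module t_start t_end day priority mode")

def decide(rules, port, group, module, day, hour):
    """Return the access-mode of the highest-priority matching rule."""
    matches = [r for r in rules
               if r.port == port and r.group == group
               and r.module == module and r.day == day
               and r.t_start <= hour < r.t_end]
    if not matches:
        return "deny"                       # assumed default-deny
    return max(matches, key=lambda r: r.priority).mode

rules = [
    Rule("frontdoor-lock", "kids", "unlock-app", 0, 24, "sat", 1, "allow"),
    Rule("frontdoor-lock", "kids", "unlock-app", 22, 24, "sat", 2, "deny"),
]
```

With these rules, the kids' unlock app works on Saturday daytime, but the higher-priority rule revokes access after 22:00, illustrating time in access-control decisions.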

  15. Implementation status • Built on the .NET CLR • ~15,000 lines of C# (~2,500 in the kernel) • 11 applications, averaging ~300 lines each • Music Follows the Lights: play, pause & transfer music wherever lights are on/off • Two-factor authentication: based on spoken password and face recognition

  16. Open questions/Ongoing work • Additional evaluation • Is it easy to write apps and drivers? • Is it easy to manage? • Does it scale to large homes? • Deploy & support application development • Explore business/economic issues

  17. Summary • A home-wide OS can make home technology manageable and programmable • HomeOS balances stakeholder desires • Developers: abstracts four sources of heterogeneity • Vendors: enables innovation and differentiation • Users: provides management primitives that match their mental models. http://research.microsoft.com/homeos

  18. Detecting Network Neutrality Violations with Causal Inference. Mukarram Bin Tariq, Murtaza Motiwala, Nick Feamster, Mostafa Ammar. Georgia Tech. http://gtnoise.net/nano/

  19. The Network Neutrality Debate • Users have little choice of access networks. • ISPs want a "share" of the monetizable traffic they carry for content providers. (November 6, 2006)

  20. Goal: Make ISP Behavior Transparent. Our goal: transparency. Expose performance discrimination to users. (Source: Glasnost project)

  21. Existing Techniques are Too Specific • Detect specific discrimination methods and policies • Testing for TCP RST packets (Glasnost) • ToS-bits based de-prioritization (NetPolice) • Limitations • Brittle: discrimination methods may evolve • Evadable • ISP can whitelist certain servers, destinations, etc. • ISP can prioritize monitoring probes • Active probes may not reflect user performance • Monitoring is not continuous

  22. Main Idea: Detect Discrimination From Passively Collected Data. This talk: design, implementation, evaluation, and deployment of NANO. Objective: establish whether an observed degradation in performance is caused by the ISP. Method: passively collect performance data and analyze the extent to which the ISP causes this degradation.

  23. Ideal: Directly Estimate Causal Effect. Causal Effect (θ) = E(real throughput using ISP) - E(real throughput not using ISP), i.e., performance with the ISP minus the baseline performance. These are "ground truth" values for performance with and without the ISP (the "treatment variable"). Problem: we need both ground-truth values for the same client, and these values are typically not available.

  24. Instead: Estimate Association from Observed Data. Association (α) = E(observed throughput using ISP) - E(observed throughput not using ISP), i.e., observed performance with the ISP minus the observed baseline performance. Problem: association does not equal causal effect. How can we estimate the causal effect from the association?

  25. Association is Not Causal Effect. Why? Confounding variables can confuse inference. • Suppose Comcast users observe lower BitTorrent throughput. • Can we assume that Comcast is discriminating? • No! Other factors ("confounders") may correlate with both the choice of ISP and the outcome variable. (Figure: client setup, time of day, location, and content all influence both the choice of Comcast and BitTorrent throughput.)

  26. Strawman: Random Treatment. A common approach in epidemiology: treat subjects randomly, irrespective of their initial health, then measure the association with the new outcome (S = sick, H = healthy). In the slide's example, α = 0.8 - 0.25 = 0.55. The association α converges to the causal effect θ if the confounding variables do not change during treatment.
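The arithmetic in the random-treatment example is just a difference of group means. The sketch below uses made-up outcomes chosen to reproduce the slide's 0.8 - 0.25 = 0.55.

```python
# After random treatment, the association alpha is the healthy
# fraction among treated subjects minus the healthy fraction among
# untreated subjects (1 = healthy, 0 = sick). Outcomes are synthetic,
# picked to match the slide's numbers.

def association(treated, untreated):
    """Difference in healthy rates between the two groups."""
    return sum(treated) / len(treated) - sum(untreated) / len(untreated)

treated   = [1, 1, 1, 1, 0, 1, 1, 1, 1, 0]   # 8/10 = 0.8 healthy
untreated = [1, 0, 0, 0, 1, 0, 0, 0]          # 2/8 = 0.25 healthy

alpha = association(treated, untreated)       # 0.8 - 0.25 = 0.55
```

Because treatment was assigned at random, this α is also an estimate of the causal effect θ, which is exactly what the next slide shows the Internet does not permit.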

  27. The Internet Does Not Permit Random Treatment • Random treatment requires changing a user's ISP. • Problems: • Cumbersome: nearly impossible to achieve for a large number of users • Does not eliminate all confounding variables (e.g., a change of equipment in the user's home network) • Alternate approach: stratification

  28. Stratification: Adjusting for Confounders • Step 1: Enumerate confounders (e.g., client setup). • Step 2: Stratify along confounder variable values and measure the association within each stratum. In the slide's example, one stratum shows treated 0.75 vs. baseline 0.20 (causal effect θ = 0.55), while another shows 0.44 vs. 0.55 (θ = -0.11). Within a stratum, association implies causation (there is no other explanation).
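Step 2 above can be sketched as a group-by on the confounder value. The stratum names and outcomes below are synthetic, chosen to mirror the slide's 0.55 and -0.11 strata; this is an illustration, not NANO's implementation.

```python
# Stratification sketch: group samples by the confounder value, then
# measure the treated-minus-baseline association inside each stratum.

from collections import defaultdict

def stratified_association(samples):
    """samples: list of (stratum, treated_bool, healthy_01)."""
    strata = defaultdict(lambda: {"t": [], "b": []})
    for stratum, treated, healthy in samples:
        strata[stratum]["t" if treated else "b"].append(healthy)
    return {s: sum(g["t"]) / len(g["t"]) - sum(g["b"]) / len(g["b"])
            for s, g in strata.items()}

samples = (
    [("laptop", True, 1)] * 3 + [("laptop", True, 0)] * 1 +    # 0.75
    [("laptop", False, 1)] * 1 + [("laptop", False, 0)] * 4 +  # 0.20
    [("desktop", True, 1)] * 4 + [("desktop", True, 0)] * 5 +  # 4/9
    [("desktop", False, 1)] * 5 + [("desktop", False, 0)] * 4  # 5/9
)

effects = stratified_association(samples)
```

Once every confounder is held fixed within a stratum, the per-stratum association is read off as the causal effect; the two strata here disagree (0.55 vs. about -0.11), which is exactly why adjusting for confounders matters.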

  29. Stratification on the Internet: Challenges • What is the baseline performance? • What are the confounding variables? • Which data to use, and how to collect it? • How to infer the discrimination method?

  30. What is the baseline performance? • Baseline: Service performance when ISP not used • Need some ISP for comparison • Approach: Average performance over other ISPs • Limitation: Other ISPs may also discriminate
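The "average over other ISPs" baseline above is a one-liner; the sketch below just makes it explicit. ISP names and throughput values are hypothetical.

```python
# Baseline sketch: a service's baseline performance is its mean
# performance across every ISP *except* the one under test.

def baseline(perf_by_isp, isp):
    """Mean throughput over all ISPs other than `isp`."""
    others = [v for k, v in perf_by_isp.items() if k != isp]
    return sum(others) / len(others)

perf = {"comcast": 2.0, "att": 5.0, "verizon": 7.0}   # Mbps, made up
b = baseline(perf, "comcast")                          # (5.0 + 7.0) / 2
```

The limitation noted on the slide shows up directly here: if "att" or "verizon" also discriminate, the baseline itself is biased.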

  31. What are the confounding variables? • Client-side • Client setup: Network Setup, ISP contract • Application: Browser, BT Client, VoIP client • Resources: Memory, CPU, network utilization • Other: Location, number of users sharing home connection • Temporal • Diurnal cycles, transient failures

  32. What data to use; how to collect it? http://www.gtnoise.net/nano/ • NANO-Agent: Client-side, passive collection • per-flow statistics: throughput, jitter, loss, RST packets • application associated with flow • resource monitoring • CPU, memory, network utilization • Performance statistics sent to NANO-Server • Monitoring, stratification, inference
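The statistics listed above suggest a simple per-flow record. The field names below are made up to mirror the slide's list; the real NANO-Agent's wire format is not shown in this deck.

```python
# Hypothetical sketch of the per-flow record a NANO-Agent-like
# collector might ship to the NANO-Server.

from dataclasses import dataclass, asdict

@dataclass
class FlowStats:
    app: str                 # application associated with the flow
    throughput_kbps: float   # per-flow performance statistics
    jitter_ms: float
    loss_rate: float
    rst_seen: bool           # TCP RST observed on the flow
    cpu_util: float          # client resource context at capture time
    mem_util: float
    net_util: float

rec = FlowStats("bittorrent", 850.0, 12.5, 0.02, False, 0.4, 0.6, 0.3)
payload = asdict(rec)        # dict form, as might be sent to the server
```

Shipping the resource-utilization fields alongside the flow statistics is what lets the server stratify on client-side confounders later.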

  33. Evaluation: Three Experiments Experiment 1: Simple Discrimination • HTTP Web service • Discriminating ISPs drop packets Experiment 2: Long Flow Discrimination • Two HTTP servers S1 and S2 • Discriminating ISPs throttle traffic for S1 or S2 if the transfer exceeds a certain threshold Experiment 3: BitTorrent Discrimination • Discriminating ISP maintains a list of preferred peers • Higher drop rate for BitTorrent traffic to non-preferred peers

  34. Experiment Setup • Clients running NANO-Agent on ~200 PlanetLab nodes • 5 access ISPs in Emulab: 2 discriminating (D1, D2) and 3 neutral (N1, N2, N3) • HTTP and BitTorrent discrimination: throttling and dropping policy implemented with a Click router • Confounding variable: server location, near servers (West-coast nodes) vs. far servers (remaining PlanetLab nodes)

  35. Without Stratification, Detecting Discrimination is Difficult. Simple Discrimination experiment: the overall throughput distribution in discriminating and non-discriminating ISPs is similar.

  36. Stratification Identifies Discrimination. Across the Simple, Long-Flow, and BitTorrent experiments, discriminating ISPs have a clearly identifiable causal effect on throughput; neutral ISPs are absolved.

  37. Implementation and Deployment http://gtnoise.net/nano/ • Implementation • Linux version available • Windows and MacOS versions in progress • Now: 27 users • Need thousands for inference • A performance dashboard may help attract users (shows performance relative to other users, DNS latency, traffic breakdown, and throughput)

  38. Summary and Next Steps • Internet Service Providers discriminate against classes of users and application traffic today. • Need passive approach • ISP discrimination techniques can evolve, or may not be known to users. • Tradeoff: Must be able to enumerate confounders • NANO: Network Access Neutrality Observatory • Infers discrimination from passively collected data • Detection succeeds in controlled environments • Deployment in progress. Need more users. http://gtnoise.net/nano/

  39. NANO Can Infer Discrimination Criteria. Evaluation setup: the ISP throttles the throughput of any flow larger than 13 MB (about 10K packets). The inferred rule recovers the threshold: cum_pkts <= 10103 -> not_discriminated; cum_pkts > 10103 -> discriminated.
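A rule like the cum_pkts split above can be found with a decision-tree-style threshold search. The sketch below is an illustration on synthetic flows, not NANO's actual inference code, and the flow sizes are made up.

```python
# Threshold-inference sketch: scan candidate splits on cum_pkts and
# keep the one that best separates discriminated from clean flows
# (a one-feature, one-level decision tree).

def best_threshold(flows):
    """flows: list of (cum_pkts, discriminated_bool)."""
    best, best_err = None, float("inf")
    for t, _ in flows:
        # classify a flow as discriminated iff cum_pkts > t
        err = sum((p > t) != d for p, d in flows)
        if err < best_err:
            best, best_err = t, err
    return best

# Synthetic trace: flows above ~10K packets were throttled.
flows = [(500, False), (4000, False), (10103, False),
         (10104, True), (20000, True), (45000, True)]

split = best_threshold(flows)   # 10103: cum_pkts <= 10103 is clean
```

With enough labeled flows, the recovered split converges on the ISP's actual policy boundary, which is how the 10103-packet rule on the slide was read off.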

  40. Sufficiency of Confounding Variables

  41. Why Association != Causal Effect? • Positive correlation between health and treatment • Can we say that Aspirin causes better health? • Confounding variables correlate with both the cause and outcome variables and confuse the causal inference. (Figure: sleep, diet, age, and other drugs influence both aspirin use and health.)

  42. Causality: An Analogy from Health Epidemiology: study causal relationships between risk factors and health outcome NANO: infer causal relationship between ISP and service performance degradation

  43. Without Stratification, Detecting Discrimination is Hard. Simple Discrimination and Long Flow Discrimination experiments: the overall throughput distribution in discriminating and non-discriminating ISPs is similar. Server location is confounding.
