Framework, Usages, Metrics Proposal for TGt

Pratik Mehta, Fahd Pirzada – Dell; Paul Canaan – Intel; Amer Hassan, Don Berry – Microsoft. September 2004.


Presentation Transcript


  1. Framework, Usages, Metrics Proposal for TGt. Pratik Mehta, Fahd Pirzada – Dell; Paul Canaan – Intel; Amer Hassan, Don Berry – Microsoft. September 2004

  2. Excerpt from Nov 2003 Presentation – Schedule Plan
  The milestones set out in Nov 2003 are on track … WPP/TGt have been meeting expectations …
  Presenter: Stephen Berger, Reference doc.: 11-03-0929-00-0wng


  4. Excerpt from Nov 2003 Presentation – Scope
  WPP/TGt continues to focus on the scope set out in Nov 2003 … Proposed Scope
  Presenter: Pratik Mehta, Reference doc.: 11-03-0927-00-0wng

  5. Excerpt from Nov 2003 Presentation – Metrics
  WPP/TGt continues to discuss metrics highlighted in Nov 2003 by Broadcom…
  • Deciding what parameters are to be considered is the challenge.
  • How do we transform user perception of performance into a set of repeatably measurable quantities?
  • Throughput and Range
  • Visibility of APs
  • Delays in association
  • Host CPU utilization
  • Ability to roam without loss of connections
  • Etc.
  Slides 16-20 of that presentation delve further into metrics that are still relevant … TGt can decide what makes sense …
  Presenter: Jason Trachewsky et al., Reference doc.: 11-03-0933-00-0wng

  6. Excerpts from Presentations – Metrics
  WPP/TGt continues to discuss usage metrics…
  • Roaming is important to consider
  • Roaming brings in many variables and causes that impact performance
  Presenter: Bob Mandeville, Reference doc.: 11-03-0617-00-0wng
  A more recent presentation on metrics by Bob covers:
  • Maximum Forwarding Rate
  • FWMOL (Forwarding rate at maximum offered load)
  • Frame Loss, Frame Loss Rate
  • Latency
  • Jitter
  • Association Capacity, Association Rate
  • Rate versus Range
  Presenter: Bob Mandeville, Reference doc.: 11-04-0987-00-0wpp
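Several of the metrics listed above (frame loss rate, jitter) are derivable from matched send/receive packet timestamps. The sketch below is a hypothetical illustration, not a TGt-defined method: it assumes a simple trace format and computes frame loss rate plus an RFC 3550-style smoothed interarrival jitter.

```python
# Hypothetical sketch: deriving frame loss rate and jitter from a
# packet trace. The trace format (parallel timestamp lists) is an
# assumption for illustration, not part of the TGt proposal.

def frame_loss_rate(sent, received):
    """Fraction of offered frames that never arrived."""
    return (len(sent) - len(received)) / len(sent)

def mean_jitter(send_times, recv_times):
    """Smoothed interarrival jitter, RFC 3550 style:
    J = J + (|D| - J) / 16, where D is the change in one-way
    transit time between consecutive packets."""
    jitter = 0.0
    transit = [r - s for s, r in zip(send_times, recv_times)]
    for prev, cur in zip(transit, transit[1:]):
        jitter += (abs(cur - prev) - jitter) / 16.0
    return jitter

# Example: 10 frames offered, 9 delivered, with mildly varying delay.
sent = list(range(10))
send_times = [i * 0.02 for i in range(9)]                     # 20 ms spacing
recv_times = [t + 0.005 + 0.001 * (i % 3) for i, t in enumerate(send_times)]
print(frame_loss_rate(sent, recv_times))                      # 0.1
print(mean_jitter(send_times, recv_times) > 0.0)              # True
```

The 1/16 smoothing gain follows RFC 3550; a raw per-packet delay-variation series could equally be reported if TGt prefers unsmoothed values.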

  7. Excerpt from Recent WPP/TGt Presentation – Framework
  WPP/TGt has discussed a framework approach … below is as highlighted by Intel…
  Framework for WPP:
  • Test Environment
    • Come to consensus on environment types (NLOS, LOS, Conductive, etc.)
    • Develop a mechanism to characterize the environment
    • Propose recommended practices for the environmental set-up
  • Metrics
    • Define metrics relevant to characterize the environment
    • Identify metrics that impact WLAN performance, as well as corresponding methodology to obtain these metrics
    • Devise a mechanism to predict the metric performance
  • Test Case Template
    • Come to consensus on the necessary metrics to characterize the environment that should be reported in a test template
    • Define test case scenarios for each metric
    • Determine recommended practice for reporting/analyzing the results
  • Usage
    • Come to consensus on usage scenarios that represent the WLAN traffic pattern (file transfers, streaming media, connectivity, voice over IP, etc.)
    • Categorize which metrics have the greatest impact on the performance of those usage scenarios
  Presenter: Paul C. Canaan, Reference doc.: 11-04-0802-01-0wpp
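One concrete way to "develop a mechanism to characterize the environment," as the framework calls for, is to fit the exponent of a log-distance path loss model to measured data. This is a hedged sketch under assumptions (the model choice, the 1 m reference distance, and the 40 dB reference loss are illustrative, not framework-mandated values).

```python
# Hedged sketch: characterize an environment by fitting the path loss
# exponent n of the log-distance model PL(d) = PL(d0) + 10*n*log10(d/d0).
# d0 and pl_d0 are assumed reference values, not TGt-defined constants.
import math

def fit_path_loss_exponent(samples, d0=1.0, pl_d0=40.0):
    """Least-squares fit of n from (distance_m, path_loss_dB) samples."""
    xs = [10.0 * math.log10(d / d0) for d, _ in samples]
    ys = [pl - pl_d0 for _, pl in samples]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic data generated with n = 3.0 (a typical NLOS-like exponent).
samples = [(d, 40.0 + 10 * 3.0 * math.log10(d)) for d in (2, 4, 8, 16)]
print(round(fit_path_loss_exponent(samples), 2))  # 3.0
```

A fitted exponent near 2 suggests free-space/LOS conditions, while higher values indicate cluttered NLOS environments, so the single number could serve as a compact environment descriptor in a test template.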

  8. Interim Summary
  • TGt is focusing on the right approaches
  • TGt is also meeting the deliverables that were highlighted 9 months ago
  • TGt has the right constituencies and experts represented
  • In short, TGt is a healthy group, and can move forward and deliver
  • There is a call for urgency in the short term to address some key industry needs:
    • OEM needs during product development
    • Reviewer/publisher needs for product reviews
    • Guidelines for system integrators
  • The following slides provide a proposal to move forward

  9. Restatement of TGt Framework
  Usage + Environment → Metrics → Methodology → Test Template … → Prediction
  • Usage + Environment
  • Metrics, Sub-metrics
  • Methodology: Test Environment, Test Procedure, Test Template
  • Use Measurements To Predict

  10. Definition of Terms in TGt Framework
  Usage + Environment → Metrics → Methodology → Test Template … → Prediction
  • Usage + Environment
    • Usage and environment of the end-user
  • Metrics, Sub-metrics
    • Metrics are the key performance indicators as perceived by users – for the usages they employ in their environments (above)
    • Sub-metrics are drivers of those metrics
    • Sub-metrics help understand what affects the metrics
  • Methodology: Test Environment, Test Procedure, Test Template
    • Methodology is the way to measure metrics and sub-metrics
    • Test Environment is a model of the user’s environment
    • Test Procedure is the set of steps to perform measurements
    • Test Template is the content and format used to report the measurements
  • Use Measurements To Predict
    • Output from the methodology and measurements is used to predict performance
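Purely as an illustration, the terms defined above could map onto a report structure like the one below. The field names and values are assumptions for the sketch, not a TGt-defined schema.

```python
# Illustrative only: a possible shape for a "Test Template" report
# carrying the framework's terms. Field names are assumptions, not a
# TGt-defined schema.
from dataclasses import dataclass, field

@dataclass
class TestTemplate:
    usage: str                 # e.g. "Data Oriented Applications"
    environment: str           # e.g. "Home", "Corporate", "Chamber"
    metrics: dict = field(default_factory=dict)      # metric -> value
    sub_metrics: dict = field(default_factory=dict)  # driver -> value
    procedure: str = ""        # steps taken to perform measurements

report = TestTemplate(
    usage="Data Oriented Applications",
    environment="Indoor, NLOS",
    metrics={"throughput_mbps": 18.4, "range_m": 30},
    sub_metrics={"snr_db": 22, "path_loss_exponent": 3.1},
    procedure="Chariot throughput script, rotating table, 3 runs",
)
print(report.metrics["throughput_mbps"])  # 18.4
```

Keeping metrics and sub-metrics in separate fields mirrors the slide's distinction between user-perceived indicators and their drivers, which makes the "use measurements to predict" step easier to automate later.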

  11. Proposal – Focus on Three Cases
  Usage + Environment → Metrics → Methodology → …
  1. Data Oriented Applications + Home, Corporate Environment
  • Metrics: Throughput, Range, Directionality
  • Sub-metrics: Path loss, EVM measurements, Interference, Multipath, Noise, SNR, Receive sensitivity, Transmit power
  • Methodology: Chariot scripts (chamber, indoor); Web downloads (indoor); File transfers, file sharing (indoor); Rotating tables
  2. Streaming Media Apps + Home, Corporate Environment
  • Metrics: All of 1, Frames dropped, Image quality
  • Sub-metrics: TCP retransmissions, Packet errors, Latency, AV synchronization, Control channel (gaming, multicast), Image quality (jitter, blockiness, blur), Bandwidth stability, Buffering (pre-roll buffer, jitter buffer)
  • Methodology: Unicast, Multicast, Real-time streaming, Stored content, Decoder/encoder characteristics, Network protocols (TCP, UDP), Buffering, Number of streams; Apps: Bobsled, VideoLAN, Real server
  3. Latency Sensitive Apps + Home, Corporate Environment
  • Metrics: All of 1, Latency, Jitter
  • Sub-metrics: MOS/PESQ, PSQM, One-way delay, Packet loss
  • Methodology: Decoder/encoder characteristics, Network protocols (H.323, P2P, SIP); Apps: NetMeeting, Skype, Vonage
  Each of these usage cases will need to be partitioned (the next slide shows an example for #1).
  Attributes common to all three usages to consider: number of clients; number of APs; stationary/roaming; encryption/authentication.
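The usage cases above pair user-level metrics (throughput) with physical sub-metrics (SNR, path loss). As a crude illustration of how a sub-metric could feed a metric prediction, the Shannon capacity bound below converts a measured SNR into a throughput ceiling. This is an assumption-laden sketch: it ignores 802.11 MAC overhead and the discrete rate set, so it only upper-bounds what a real link could achieve.

```python
# Hypothetical "predict from sub-metrics" sketch: a Shannon-capacity
# upper bound on throughput from measured SNR. Real 802.11 prediction
# would also model MAC overhead and the discrete PHY rate set; this is
# an illustrative bound only.
import math

def shannon_upper_bound_mbps(snr_db, bandwidth_hz=20e6):
    """Channel-capacity ceiling C = B * log2(1 + SNR) in Mb/s."""
    snr_linear = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

print(round(shannon_upper_bound_mbps(20), 1))  # ~133 Mb/s for 20 MHz
```

Any realistic predictor would sit well below this bound; the point is only that the framework's sub-metric-to-metric arrow can be made computable.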

  12. Partitioning Approach for Usage Case #1
  Usage + Environment → Metrics → Methodology → Test Template …
  1. Data Oriented Applications + Home, Corporate Environment
  • Chariot Throughput + Indoor, Chamber Environment
    • Metrics: Throughput, Range, Directionality
    • Sub-metrics: Path loss, EVM measurements, Interference, Multipath, Noise, SNR, Receive sensitivity, Transmit power
    • Methodology: ?
  • Web Downloads + Indoor Environment
    • Metrics: Latency, Throughput, Range
    • Sub-metrics: Path loss, EVM measurements, Interference, Multipath, Noise, SNR, Receive sensitivity, Transmit power
    • Methodology: ?
  • File Transfer, File Sharing + Indoor Environment
    • Metrics: Throughput, Range
    • Sub-metrics: Path loss, EVM measurements, Interference, Multipath, Noise, SNR, Receive sensitivity, Transmit power
    • Methodology: ?
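The "Methodology: ?" entries are deliberately left open by the slide. As one hypothetical way the file-transfer partition could be filled in, application-layer throughput can be measured by timing a bulk socket transfer to a traffic sink; the host, port, and transfer size below are assumptions for the sketch, not proposed TGt parameters.

```python
# Hypothetical methodology sketch for the file-transfer partition:
# time a bulk TCP transfer to a sink and report application-layer
# throughput. Endpoint and size are illustrative assumptions.
import socket
import time

def measure_throughput_mbps(host, port, nbytes=10 * 1024 * 1024):
    """Send nbytes of zeros to a sink and return throughput in Mb/s."""
    payload = b"\x00" * 65536
    sent = 0
    start = time.monotonic()
    with socket.create_connection((host, port)) as s:
        while sent < nbytes:
            s.sendall(payload)
            sent += len(payload)
    elapsed = time.monotonic() - start
    return (sent * 8) / elapsed / 1e6

# Usage (requires a listening sink, e.g. a draining TCP server):
# rate = measure_throughput_mbps("192.0.2.10", 5001)
```

A real TGt procedure would repeat the run at several ranges and orientations (rotating table) and report the full distribution in the test template, not a single number.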

  13. Similar Partitioning is needed for #2 and #3 • To be filled out

  14. Proposed Timeline for TGt
  Metrics/sub-metrics and methodology development:
  • Framework Adoption (pre-Berlin): 9 Sep ’04
  • Framework Adoption Report (in Berlin): Sep ’04
  • Usage Case #1: Nov ’04
  • Usage Case #2, Usage Case #3: Jan ’05
  • Letter Ballot: Mar ’05

  15. Next Steps
  • Get agreement on the framework and approach
  • Get agreement that the three usage cases presented are the ones we can focus on
  • Align in the Berlin meeting with the wider audience in the TG
  • Develop the three usage cases further
    • Sanitize and update the metrics and sub-metrics
    • Home in on the methodology components: test environment, procedure, etc.
    • Finalize the test case templates for reporting and analysis
  • Usable outputs for Usages #1, #2, #3 (see timeline)
  • Draft recommended practices specification
    • Include definitions and terminology

  16. Backup

  17. Parking Lot for Metrics and Sub-metrics
  Proposing a “parking-lot” list of parameters – need to determine whether they are metrics or sub-metrics, or part of the methodology, and which of the usage cases they apply to …
  Performance metrics by OSI model layers
