
Latency

TL;DR: Latency is the delay between a user's action and the system's response, commonly measured in milliseconds. Lower latency means faster responsiveness, which is crucial for real-time applications such as video streaming, online gaming, and surveillance.

What is Latency?

Latency is the time it takes for data to travel from a source to a destination and for the system to respond. It is usually measured in milliseconds and is a key factor in determining how quickly a user action receives a response. In digital systems, latency includes delays in processing, transmitting, and receiving data. For example, in video surveillance, latency refers to the delay between a live event occurring and the display of that event on a monitor.
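As a rough illustration of how such a delay can be measured, the sketch below times a TCP connection handshake and reports the result in milliseconds. It is a minimal example rather than a benchmark: the host name and port are placeholder values, the figure includes DNS lookup on the first call, and a real measurement would average many samples.

```python
import socket
import time

def measure_connect_latency(host: str, port: int, timeout: float = 2.0) -> float:
    """Return the time in milliseconds taken to open a TCP connection.

    This approximates one network round trip; it ignores DNS caching
    effects and any processing delay on the remote side.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # Connection established; only the elapsed time matters here.
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    # "example.com" and port 443 are placeholder values for illustration.
    samples = [measure_connect_latency("example.com", 443) for _ in range(5)]
    print(f"Average connect latency: {sum(samples) / len(samples):.1f} ms")
```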

Latency affects applications that require immediate feedback, such as online gaming, live streaming, and real-time monitoring. Higher latency can cause delays that disrupt user experience, while lower latency allows for smoother, more synchronized interactions.

What is CAS Latency?

CAS (Column Address Strobe) latency is the delay between when a memory controller requests data from a specific column in a computer's RAM (Random Access Memory) and when that data is actually available. CAS latency is measured in clock cycles, with a lower CAS latency indicating faster memory response times.

In technical terms, if a memory module has a CAS latency of 16, it takes 16 clock cycles for the data to be ready after the request. CAS latency is one of several factors that affect memory performance, with lower CAS values generally improving speed, especially in applications that frequently access RAM, such as gaming or high-performance computing tasks.
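Because CAS latency is counted in clock cycles, the absolute delay also depends on the memory clock: DDR memory transfers data twice per clock cycle, so the clock in MHz is half the quoted data rate in MT/s, and the delay in nanoseconds is the CL value times one clock period. The snippet below is a small sketch of that arithmetic; the DDR4-3200 CL16 and DDR5-6400 CL32 figures are example values.

```python
def cas_latency_ns(cas_cycles: int, data_rate_mts: float) -> float:
    """Convert CAS latency (clock cycles) to an absolute delay in nanoseconds."""
    clock_mhz = data_rate_mts / 2          # e.g. DDR4-3200 runs a 1600 MHz clock
    clock_period_ns = 1000 / clock_mhz     # duration of one clock cycle in ns
    return cas_cycles * clock_period_ns

# Example values: a hypothetical DDR4-3200 module rated CL16.
print(f"DDR4-3200 CL16: {cas_latency_ns(16, 3200):.1f} ns")   # -> 10.0 ns
# A faster module with a higher CL can have the same absolute latency:
print(f"DDR5-6400 CL32: {cas_latency_ns(32, 6400):.1f} ns")   # -> 10.0 ns
```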

What is a Good Latency Speed?

A good latency speed depends on the application:

  • Internet and Web Browsing: Latency under 100 milliseconds (ms) is generally acceptable for standard browsing. Under 20 ms is excellent for a smooth experience.
  • Online Gaming: Latency under 50 ms is ideal, as gaming requires fast response times for smooth play. Between 50 and 100 ms is workable but may lead to minor lag; over 100 ms can significantly impact gameplay.
  • Video Streaming (Live or Conferencing): Latency below 50 ms provides an optimal experience, especially in real-time communication applications like video calls. Latency up to 200 ms may still be tolerable, but it can cause noticeable delays in conversation flow.
  • Video Surveillance: Low latency, ideally under 100 ms, is critical for real-time monitoring, especially in high-security environments where instant response is essential.

In general, lower latency improves performance in applications requiring real-time responsiveness, but what counts as "good" varies with the specific needs of each use case; the sketch below shows one way to check a measured value against these ranges.
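The following snippet simply mirrors the thresholds listed above as a small lookup table; the category names and cut-off values are taken from this section and are illustrative, not standardized limits.

```python
# Rough latency guidelines (in ms) taken from the list above.
# Each entry maps a use case to (good threshold, acceptable threshold).
GUIDELINES = {
    "web_browsing": (20, 100),
    "online_gaming": (50, 100),
    "video_conferencing": (50, 200),
    "video_surveillance": (100, 100),
}

def rate_latency(use_case: str, latency_ms: float) -> str:
    """Classify a measured latency as 'good', 'workable', or 'poor' for a use case."""
    good, acceptable = GUIDELINES[use_case]
    if latency_ms <= good:
        return "good"
    if latency_ms <= acceptable:
        return "workable"
    return "poor"

print(rate_latency("online_gaming", 42))        # -> good
print(rate_latency("online_gaming", 85))        # -> workable
print(rate_latency("video_conferencing", 250))  # -> poor
```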