PCI-SIG specifies a bit error rate (BER) threshold of 1E-12 or better for PCI Express 3.0. What might cause this threshold to be exceeded, and what happens if it is?
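To put the threshold in perspective, a rough back-of-envelope calculation (an illustration, not from the source) shows how often a lane running exactly at the BER limit would see an error:

```python
# Hypothetical illustration: expected time between bit errors on a
# single PCIe 3.0 lane operating exactly at the PCI-SIG BER threshold.
line_rate = 8e9   # PCIe 3.0 line rate: 8 GT/s per lane
ber = 1e-12       # PCI-SIG BER threshold

# Errors arrive at (line_rate * ber) errors per second on average.
seconds_per_error = 1 / (line_rate * ber)
print(f"~1 bit error every {seconds_per_error:.0f} s per lane")  # ~125 s
```

At 8 GT/s, a BER of 1E-12 works out to roughly one bit error every two minutes per lane, which is why sustained stress testing is needed to observe margin at all.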
Determining a system's margins requires collecting a statistically large enough sample to achieve a meaningful result. The required sample size must account for variances due to silicon process, temperature, voltage, finite test time, and a number of other factors. What is the math behind this?
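One common piece of that math is the confidence-level bound for error-free testing (a sketch under the standard statistical assumption of independent bit errors, not necessarily the exact method the author has in mind): to demonstrate a BER at or below a target with confidence CL while observing zero errors, you must transmit at least N = -ln(1 - CL) / BER bits.

```python
import math

def bits_required(ber_target: float, confidence: float) -> float:
    """Bits that must be observed error-free to claim BER <= ber_target
    at the given confidence level (assumes independent bit errors)."""
    return -math.log(1.0 - confidence) / ber_target

# For a 1E-12 target at 95% confidence:
n_bits = bits_required(1e-12, 0.95)   # ~3e12 bits
test_time_s = n_bits / 8e9            # at the 8 GT/s PCIe 3.0 line rate
print(f"{n_bits:.2e} error-free bits, ~{test_time_s:.0f} s per lane")
```

This is why the oft-quoted rule of thumb is "3/BER" bits for 95% confidence: about 3E12 error-free bits, or roughly six minutes of traffic per Gen3 lane, before any allowance for process, voltage, and temperature variation.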
We know that the board bring-up process can be extremely challenging on today's complex, high-speed designs. How do we get the iterative validation, test, and debug steps off the critical path of new product introduction?