soc/software/liblitedram: fix max error count computation

READ_CHECK_TEST_PATTERN_MAX_ERRORS was being computed incorrectly,
which could result in integer underflows when computing the score via
(max_errors - errors). This could happen e.g. when using
DFIRateConverter, which modifies DFII_PIX_DATA_BYTES. Now we use
DFII_PIX_DATA_BYTES in the equation, so this should not happen.
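
As an illustration (not part of the commit), a minimal standalone C sketch of the underflow follows. The PHY constants are assumed example values mimicking a PHY wrapped in a DFIRateConverter, and the score computation is a simplification of the actual scan code in liblitedram:

#include <stdio.h>

/* Assumed example values: with DFIRateConverter the DFI data width per
 * phase (DFII_PIX_DATA_BYTES) grows with the conversion ratio, while
 * SDRAM_PHY_XDR stays the same, so the old formula undercounts the
 * number of bits actually checked per read. */
#define SDRAM_PHY_PHASES      4
#define SDRAM_PHY_XDR         1
#define SDRAM_PHY_MODULES     2
#define DFII_PIX_DATA_BYTES   8

#define MAX_ERRORS_OLD (8*SDRAM_PHY_PHASES*SDRAM_PHY_XDR)                          /* 32  */
#define MAX_ERRORS_NEW (8*SDRAM_PHY_PHASES*DFII_PIX_DATA_BYTES/SDRAM_PHY_MODULES)  /* 128 */

int main(void) {
    /* Worst case: every checked bit differs from the test pattern. */
    unsigned int errors = MAX_ERRORS_NEW;

    /* With the old bound, errors can exceed max_errors, and the
     * unsigned subtraction wraps around to a huge bogus score. */
    unsigned int score_old = MAX_ERRORS_OLD - errors;  /* underflows */
    unsigned int score_new = MAX_ERRORS_NEW - errors;  /* 0, as expected */

    printf("old score: %u\nnew score: %u\n", score_old, score_new);
    return 0;
}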
Jędrzej Boczar 2021-08-30 09:28:17 +02:00
parent 1d27e25cbb
commit f943092bb5
1 changed file with 1 addition and 1 deletion


@@ -300,7 +300,7 @@ static void print_scan_errors(unsigned int errors) {
#endif
}
-#define READ_CHECK_TEST_PATTERN_MAX_ERRORS (8*SDRAM_PHY_PHASES*SDRAM_PHY_XDR)
+#define READ_CHECK_TEST_PATTERN_MAX_ERRORS (8*SDRAM_PHY_PHASES*DFII_PIX_DATA_BYTES/SDRAM_PHY_MODULES)
static unsigned int sdram_write_read_check_test_pattern(int module, unsigned int seed) {
int p, i;