```java
List<UserEo> userEoList = comDataComponent.getUserListByUid();
for (UserEo userEo : userEoList) {
    List<DataPermissionSchema> userDataPermissionSchema = getUserDataPermissionSchema(userEo.getUserUid(), userEo.getOrgCode());
    // Store the current user's data permissions in Redis: key = SystemsConstant.DATA_PERMISSION_SCHEMA + userUid
    redisTemplate.opsForValue().set(SystemsConstant.DATA_PERMISSION_SCHEMA + userEo.getUserUid(), ObjectUtil.toString(userDataPermissionSchema));
}
```
Suppose userEoList contains 10,000 entries. How can this be optimized?
Time: 2024-02-03 15:11:50 Views: 194
If userEoList is large, you can process it concurrently with multiple threads to improve throughput: split the user list into smaller batches and submit each batch as its own task. To cut the network overhead of the Redis writes, use Redis pipelining so that each batch sends all of its SET commands in a single round trip instead of one round trip per user.

For example, the code can be rewritten as follows:
```java
List<UserEo> userEoList = comDataComponent.getUserListByUid();
int batchSize = 1000; // process 1,000 users per batch
int threadCount = (userEoList.size() + batchSize - 1) / batchSize; // one task per batch
ExecutorService executorService = Executors.newFixedThreadPool(threadCount);
for (int i = 0; i < threadCount; i++) {
    int start = i * batchSize;
    int end = Math.min(start + batchSize, userEoList.size());
    List<UserEo> subList = userEoList.subList(start, end);
    executorService.submit(() -> {
        // One pipeline per batch: all SETs in the batch share a single round trip.
        // (Calling executePipelined once per user would defeat the purpose of pipelining.)
        redisTemplate.executePipelined((RedisCallback<Object>) connection -> {
            RedisSerializer<String> serializer = redisTemplate.getStringSerializer();
            for (UserEo userEo : subList) {
                List<DataPermissionSchema> userDataPermissionSchema = getUserDataPermissionSchema(userEo.getUserUid(), userEo.getOrgCode());
                String redisKey = SystemsConstant.DATA_PERMISSION_SCHEMA + userEo.getUserUid();
                connection.set(serializer.serialize(redisKey),
                        serializer.serialize(ObjectUtil.toString(userDataPermissionSchema)));
            }
            return null;
        });
    });
}
executorService.shutdown();
try {
    executorService.awaitTermination(Long.MAX_VALUE, TimeUnit.SECONDS);
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}
```
This way, the large user list is processed concurrently across multiple threads, and pipelining the Redis writes within each batch reduces network round trips, improving overall performance.
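The batch-partitioning step above (computing start/end with subList) can be verified in isolation without Redis or a thread pool. Here is a minimal standalone sketch; the class and method names (`BatchPartition`, `partition`) are illustrative, not part of the original code:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchPartition {

    // Split a list into consecutive batches of at most batchSize elements.
    // Uses subList views, so no elements are copied.
    static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int start = 0; start < items.size(); start += batchSize) {
            int end = Math.min(start + batchSize, items.size());
            batches.add(items.subList(start, end));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            ids.add(i);
        }
        List<List<Integer>> batches = partition(ids, 1000);
        System.out.println(batches.size());        // 10 batches for 10,000 users
        System.out.println(batches.get(9).size()); // last batch holds 1000 users
    }
}
```

Because subList returns a view backed by the original list, each submitted task reads its slice without copying; the original list must not be structurally modified while the tasks run.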