Python: Splitting a Large File into Chunks
When dealing with large files, you can use the following method to split them into smaller chunks:
1. Open the file with open().
2. Read a chunk of the specified size with read(chunk_size).
3. Perform whatever operation is needed on the chunk, such as processing the data (see the sketch after this list) or writing it to another file.
4. Repeat steps 2 and 3 until the entire file has been read and processed.
5. Close the file (this happens automatically when the file is opened with a with statement).
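The per-chunk operation in step 3 can be anything that accepts a bytes object. As one illustration, here is a minimal sketch that feeds each chunk into an incremental SHA-256 hash from the standard hashlib module; the function name hash_file_in_chunks, the default chunk size, and the example file name are assumptions for illustration, not part of the original method.

import hashlib

def hash_file_in_chunks(file_path, chunk_size=1024 * 1024):
    # Compute a SHA-256 digest one chunk at a time, so the whole
    # file never has to fit in memory.
    digest = hashlib.sha256()
    with open(file_path, 'rb') as file:
        chunk = file.read(chunk_size)
        while chunk:
            digest.update(chunk)           # step 3: process the chunk
            chunk = file.read(chunk_size)  # step 4: repeat steps 2 and 3
    return digest.hexdigest()

# Usage example (the file name is hypothetical)
# print(hash_file_in_chunks('large_file.txt'))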
Here is an example that demonstrates the basic pattern: it reads a large file in fixed-size chunks and, for simplicity, only prints the size of each chunk.
def split_file(file_path, chunk_size):
    with open(file_path, 'rb') as file:
        chunk = file.read(chunk_size)
        while chunk:
            # Process each chunk; here we simply print its size
            print(len(chunk))
            chunk = file.read(chunk_size)

# Usage example
split_file('large_file.txt', 1024)  # each chunk is 1024 bytes
In the example above, the split_file() function takes a file path and a chunk size as parameters. It opens the file with a with statement and reads one chunk at a time with read(). It then performs the desired operation on each chunk, which here is simply printing the chunk's size, and repeats until the entire file has been read and processed.
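The printing step is only a placeholder. To actually split the file on disk, each chunk can instead be written to its own numbered output file, as mentioned in step 3. The following is a minimal sketch under that assumption; the function name split_file_to_parts and the .partN naming scheme are illustrative choices, not part of the original example.

def split_file_to_parts(file_path, chunk_size):
    # Write each fixed-size chunk of the input file to its own numbered part file.
    part_number = 0
    with open(file_path, 'rb') as file:
        chunk = file.read(chunk_size)
        while chunk:
            part_path = f'{file_path}.part{part_number}'  # e.g. large_file.txt.part0
            with open(part_path, 'wb') as part_file:
                part_file.write(chunk)
            part_number += 1
            chunk = file.read(chunk_size)
    return part_number  # number of part files written

# Usage example (file name and chunk size are illustrative)
# split_file_to_parts('large_file.txt', 1024)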