I have an Excel file of around 70 MB, and just loading it causes memory usage to jump to 600 MB. I wish I had a CSV file instead, because then I could load it in chunks, line by line, but that's not the case here. I will soon be receiving an Excel file that will be around 2 GB and I have to process it, so are there any tips/tricks to get around this memory issue?
Here is my code below:
import pandas as pd

def test_excel(excel_file):
    try:
        # usecols limits the read to columns 0, 5, 7, and 12
        df = pd.read_excel(excel_file, usecols=[0, 5, 7, 12])
        print("Successfully done")
    except Exception as e:
        print(f"Error: {e}")

# Example usage:
test_excel("final.xlsx")
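For reference, the chunked, line-by-line loading I mention above is only available for CSV, not xlsx. A minimal sketch of what I mean (the file name "data.csv", the chunk size, and the column indices are just placeholders):

import pandas as pd

# Read the CSV in chunks so only a bounded number of rows is in memory at once.
total_rows = 0
for chunk in pd.read_csv("data.csv", usecols=[0, 5, 7, 12], chunksize=100_000):
    # each chunk is a DataFrame of at most 100,000 rows
    total_rows += len(chunk)
print(f"Processed {total_rows} rows")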
submitted by /u/lofi_thoughts