That's a small layout for 60 MB. 1,500 rows x 100 columns = 150,000 cells.
If your file is 60 MB, then each cell averages
60 x 1024 x 1024 bytes / 150,000 cells = ~419 bytes/cell.
But doubles are stored in 8 bytes, right? So this makes sense only if
you have a lot of longish strings. And if you have a string longer than
32,767 characters, Excel 2010 will not like it.
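A quick check in Stata, taking the 60 MB file size and 150,000 cells
at face value:

    . display 60*1024*1024/150000

which comes out to roughly 419 bytes per cell.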
However, re-reading your original post, it seems like your error is
about loading a file, not writing it - which is it?
Jeph
On 1/29/2013 4:17 PM, David Epstein wrote:
Right, and my datasets are nowhere near that big anyway: ~1500 rows
and ~100 columns each.
As I understand it, Stata reads the entire Excel workbook into memory
before changing it, even if I'm only changing data in one column of
one sheet. Perhaps that's where the overload is coming from, although,
again, the entire workbook is only about 60 MB.
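For what it's worth, the change I'm trying to make is of this general
form - the variable name, sheet name, and target cell below are just
placeholders:

    export excel newvar using "testsheet.xlsx", sheet("Data") sheetmodify cell(C1)

-sheetmodify- should leave the rest of the workbook alone, but Stata
apparently still loads the whole .xlsx into memory to rewrite it.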
On Tue, Jan 29, 2013 at 3:38 PM, Friedrich Huebler <fhuebler@gmail.com> wrote:
The problem is not Stata - you are probably exceeding the limits of Excel:
65,536 rows by 256 columns for the older .xls format (an .xlsx workbook in
Excel 2010 allows 1,048,576 rows by 16,384 columns).
You can get halfway there if you write a comma-separated file, but then
Excel still can't open it.
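In Stata 12 that would be something along these lines (the file name
is just an example):

    outsheet using "testsheet.csv", comma replace

and then open the .csv from Excel - though, as noted, Excel will still
balk if the data exceed its own limits.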
cheers,
Jeph
On 1/29/2013 2:14 PM, David Epstein wrote:
Dear Statalisters,
I have a large dataset that I want to write to an Excel file, so I was
excited about Stata 12's new, improved import excel/export excel
features. However, the file I'm writing to is ~60 MB in size, and I
keep getting an error "file testsheet.xlsx could not be loaded". This
does not happen when I try to write to a smaller or blank Excel
sheet. I increased Stata's memory to much more than 60 MB, and I even
added the undocumented "set excelxlsxlargefile on" to my code, all to
no avail.
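The commands are roughly of this form (the sheet name is just a
placeholder, and the exact sheet options may differ):

    set excelxlsxlargefile on
    export excel using "testsheet.xlsx", sheet("Sheet1") sheetmodify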
Anyone have any ideas on how to solve this? Any help appreciated.
David