Notice: On March 31, it was announced that Statalist is moving from an email list to a forum. The old list will shut down on April 23, and its replacement, statalist.org, is already up and running.
Re: st: mac editor for big files
Nick Cox <firstname.lastname@example.org>
Mon, 22 Aug 2011 01:16:28 +0100
In addition to other suggestions, look at -hexdump-.
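For readers who want a quick look at the raw bytes before reaching for Stata's -hexdump-, the Mac terminal's own hexdump utility gives a rough counterpart. A minimal sketch (the file name and contents are made up for illustration):

```shell
# Create a small sample file with a stray blank line (made-up data).
printf 'id,name\n1,alice\n\n2,bob\n' > sample.txt

# Dump the first bytes in hex plus an ASCII column, so stray control
# characters (0x0a newlines, 0x09 tabs) become visible.
hexdump -C sample.txt | head -n 5
```

Inside Stata, -hexdump- plays a similar role without leaving the program.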
On Sun, Aug 21, 2011 at 4:02 PM, Richard Goldstein wrote:
> thanks to Phil and Eric
> some responses and what I have done so far:
> the reason for wanting an editor for these files is that Stata will not
> correctly import them using -insheet- (or anything else I tried) due to
> "problems" in the data (e.g., one file had 2 successive end-of-line
> characters in the middle of what was supposed to be a line of data); so
> I want to look at the small number of files with apparent problems to
> see if I can fix them prior to importing; since I don't know what the
> problem is or even where it occurs, I have been unable to figure out how
> to use -filefilter-
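A problem like the doubled end-of-line described above can also be hunted down from the Mac terminal before importing; a sketch on a made-up sample file (an empty line is exactly two consecutive end-of-line characters):

```shell
# Sample file with an accidental blank line mid-data (made-up stand-in
# for the real file).
printf '1,alice\n\n2,bob\n' > broken.txt

# Report the line numbers of empty lines, i.e. spots where two
# end-of-line characters occur in a row.
grep -n '^$' broken.txt   # → 2:

# Once inspected, write a copy with the empty lines dropped.
grep -v '^$' broken.txt > fixed.txt
```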
> I downloaded MacVim, ultraedit (trial version) and two different emacs
> versions; ultraedit did open the files but was so slow at even moving
> within the file that I deleted the program; the emacs programs were a
> little better but not much; MacVim appears to work well, though I still
> haven't figured out how to get it to show what are ordinarily "hidden"
> characters (e.g., tab, eol); I need to see these to "fix" at least two
> files; however, just opening the files solved one problem for me: it
> told me how many lines of data there were supposed to be so I can check
> the results of importing
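In Vim (and MacVim), the built-in way to reveal "hidden" characters is `:set list`, which shows tabs as ^I and marks each line end with $. The same view is available from the terminal; a sketch on a made-up sample line:

```shell
# A line containing a tab character (made-up sample).
printf 'id\tname\n' > tabbed.txt

# -e marks each line end with $; -t shows tabs as ^I, making the
# "hidden" characters visible (both flags work with BSD and GNU cat).
cat -et tabbed.txt   # → id^Iname$
```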
> On 8/19/11 5:34 PM, Eric Booth wrote:
>> One more suggestion for working with large text files from Stata is to look at the user-written files -chewfile- and -chunky- (H/T to Dan Blanchette for pointing me to -chunky-) from SSC for breaking the large file up into manageable pieces.
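-chewfile- and -chunky- work from inside Stata; from the terminal, the standard `split` utility does something similar. A minimal sketch with made-up file names:

```shell
# A stand-in for a large file: 10 numbered lines.
seq 1 10 > big.txt

# Break it into 4-line pieces named piece_aa, piece_ab, piece_ac.
split -l 4 big.txt piece_

# Count lines per piece to confirm the split.
wc -l piece_*
```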
>> I hadn't seen the LargeFile plugin for Vim that Phil mentioned - very helpful.
>> - Eric
>> On Aug 19, 2011, at 4:23 PM, Eric Booth wrote:
>>> I usually use TextWrangler as well for large files, but I guess I haven't opened one that is larger than 1 GB since I've never had one choke. I'd try Vim or UltraEdit -- I've read that both can handle large (>1 or 2 GB) text files. Also, VEDIT can handle text files up to 2 GB, but it's not free.
>>> However, you can load large text files into Stata via commands like -insheet-, or if it's not a dataset you can import with -intext- (from SSC). Also, you can view just a part of the file from terminal by using the 'head' or 'tail' commands to see the start or end of the large text file very quickly (e.g., head -n 100 myfile.txt to see the first 100 lines). Finally, if you need to edit or make changes to the file before importing into Stata as a dataset, consider using -filefilter- to make changes to the file.
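Combining head and tail also lets you view an arbitrary slice from the middle of a large file without opening all of it; a sketch on a made-up file:

```shell
# A stand-in for a long file: 2,000 numbered lines.
seq 1 2000 > huge.txt

# View lines 995-1005 only: take the first 1005 lines, then the
# last 11 of those.
head -n 1005 huge.txt | tail -n 11
```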
>>> - Eric
>>> On Aug 19, 2011, at 4:02 PM, Richard Goldstein wrote:
>>>> I currently use Text Wrangler; however, neither it nor BBEdit will,
>>>> apparently, edit ascii files >1 GB in size (I have 32 GB of RAM); does
>>>> anyone know of an editor for the Mac that will edit files of >1 GB?