Failed to open document

Posted: Wed Aug 18, 2004 3:36 pm
by tferguso
I'm looking for some insight into this problem. I have a big (695 MB) file that I am trying to open. It's a simple text file, 36,000,000 lines long. So far, TextPad is the only editor I've been able to open it with.

Recently, on one of the computers I'm using to try to open it (the one with more memory and a newer CPU), TextPad craps out with this message: "Failed to open document".

Since TextPad has opened this document on this computer before, and since it is opening this document on another, older computer now, I don't understand what's going on.

Can anyone give me any insight into when TextPad displays this message, and why?

Posted: Wed Aug 18, 2004 4:17 pm
by s_reynisson
Has the amount of available virtual memory been changed lately on your latest and greatest?
Also, please note that word wrap slows things down when opening large files. See the Known Problems forum. HTH

Posted: Wed Aug 18, 2004 6:16 pm
by tferguso
Well,
I've never adjusted the size of my virtual memory, but I just tried changing that. That system has 1 GB of memory, and I changed the virtual memory to 2.5 GB, but that didn't help.

I've turned off both Word Wrap and Undo (Undo seriously slows things down when trying to delete a few million lines of a file, and eats up lots of memory).

I still have the same problem. I believe it's a problem with Windows (my first thought when I use Windows). This is version XP Pro. I'm just not sure what part of Windows to look into.

Any other thoughts?

Posted: Wed Aug 18, 2004 6:59 pm
by s_reynisson
What about your TP version? Latest is 473.
Also, did you just install SP2 for XP? I did a few days ago, but I have no problems here. I just opened a 377 MB file in about 3 seconds; a 1 GB file loads in about 110 seconds with a lot of disk swapping going on. I'm running XP with 1 GB of memory and a system-managed swap file.

Posted: Wed Aug 18, 2004 9:11 pm
by helios
This may depend on the amount of contiguous virtual memory allocated on the hard drive. It may be necessary to defragment it. Please see the following URL for further information.

http://www.sysinternals.com/ntw2k/freew ... frag.shtml

I hope this helps.

Posted: Thu Aug 19, 2004 11:59 am
by helios
A footnote to my previous response.

In order to maximize the amount of contiguous virtual memory, you should run the normal disk defragmenter before and after running PageDefrag on the system files.

Posted: Tue Apr 17, 2012 2:10 pm
by bobh128
I am having the same problem. I am trying to open a text file that is 1.62 GB, but I get the error "Failed to open document". It seems to be parsing the file and getting about 75% done before it stops. Any suggestions? I am running Windows 7 Ultimate 64-bit with 6 GB of RAM.

On a side note I am running a memory tool called All CPU Meter that monitors CPU and memory usage. Before opening the file it says I have 4400MB free. As the file opens that amount starts to go down. When the error occurs the amount of memory free is about 2600MB. I think I have plenty of memory.

My page file is 6135 MB. I tried defragmenting, and the system says I have 0% fragmentation.

Edit: I'm using TextPad 5.4.2

Posted: Wed May 09, 2012 2:13 pm
by bobh128
Any ideas? Anyone?

Also, word wrap is disabled.

Posted: Sat May 12, 2012 10:41 am
by woho
My idea is not about TextPad: split the file into several parts. There is lots of freeware available for download.
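If none of the ready-made tools works out, splitting by line count is only a few lines of script. A minimal Python sketch (the function name and chunk size are just placeholders), which reads one line at a time so the whole file never sits in memory:

```python
# Split a large text file into numbered chunks of a fixed number of lines.
# Only one line is held in memory at a time, so file size doesn't matter.
def split_file(path, lines_per_chunk=1_000_000):
    chunk_paths = []
    out = None
    with open(path, "r", encoding="utf-8", errors="replace") as src:
        for i, line in enumerate(src):
            if i % lines_per_chunk == 0:
                if out:
                    out.close()
                chunk_path = f"{path}.part{len(chunk_paths):03d}"
                chunk_paths.append(chunk_path)
                out = open(chunk_path, "w", encoding="utf-8")
            out.write(line)
    if out:
        out.close()
    return chunk_paths
```

Splitting on line boundaries (rather than byte offsets) keeps each chunk a valid text file that any editor can open.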

Posted: Mon May 14, 2012 2:27 pm
by bobh128
Thanks for the suggestion; however, I'm still having trouble. I tried GSplit 3, TextSplitter, and JSplit, and none of them can do what I need.

What I need to do is take the text file, make changes throughout it, and save it. I cannot edit the whole file in TextPad because it is too big. If I split the file into pieces and edit the pieces, GSplit refuses to merge the pieces back together, TextSplitter gives me an "Out of Memory" error, and JSplit gives me a "Can not split, Size is too large!" error.
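For what it's worth, if the changes can be expressed as a per-line rule, the whole edit-split-merge cycle can be avoided by streaming the file through a small script. A Python sketch, where the tab-to-comma rule is a made-up example of such a rule:

```python
# Stream a huge text file through a per-line edit, writing the result to a
# new file. Only one line is held in memory at a time.
def edit_stream(src_path, dst_path, transform):
    with open(src_path, "r", encoding="utf-8", errors="replace") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(transform(line))

# Hypothetical example rule: replace every tab with a comma.
# edit_stream("big.txt", "big_edited.txt",
#             lambda line: line.replace("\t", ","))
```

This won't help with edits that need a human looking at each line, but for systematic search-and-replace it sidesteps the editor's size limit entirely.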

What software do you recommend, woho?

Posted: Mon May 14, 2012 4:45 pm
by ak47wong
I listed a few file splitting utilities in this post that you could try.

Posted: Mon May 14, 2012 6:48 pm
by bobh128
I was able to get both HJSplit and FFSJ to work. I like FFSJ best, though, because it did the job in the shortest time. Thank you.
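For anyone stuck at the merge step, joining chunks back together is just binary concatenation in the original order (on Windows, `copy /b part1+part2 whole.txt` does the same thing). A Python sketch using fixed-size binary reads so memory use stays small:

```python
# Concatenate chunk files, in order, back into one file. Fixed-size binary
# reads keep memory use constant regardless of total file size.
def merge_files(chunk_paths, dst_path, buf_size=1 << 20):
    with open(dst_path, "wb") as dst:
        for chunk_path in chunk_paths:
            with open(chunk_path, "rb") as src:
                while True:
                    buf = src.read(buf_size)
                    if not buf:
                        break
                    dst.write(buf)
```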

On a side note, when trying to open a 1.7 GB text file, I also got a "Disk full while accessing C:\file.txt" error.