"Disk full" message
I sometimes find it convenient to use TextPad to read or browse certain large ASCII text files in my work.
Lately, the size of one of the huge files I regularly consult has grown a bit, and I find that TextPad will no longer open it.
The attempt to open the file ends with the message box, "Disk full while accessing C:\<filename>", where <filename> is my file of interest.
The increase in file size made me suspect that perhaps I needed more RAM in order to open a file of such size: the file opened fine when it was "only" 679 MB, but will not open now that it is 750 MB.
At that time, I had 1 GB of RAM in the Dell Dimension 4600 machine.
Today I increased the RAM to 2 GB, and the file will not open (I get the same message).
I've seen the advisory here about this message, with the advice to try defragging the hard drive.
That has had no good effect; the file remains unopenable.
So do other files of that size.
I run under Windows XP Pro, and have an 80 GB hard drive with 60 GB free.
In XP, I've increased the size of the Virtual Memory to 3067 MB, while it had been 1023 MB; but this has given no good effect. "Total paging file size for all drives" is now 3067 minimum and 4095 max.
Now, I know the claim is made that TextPad can open files as large as the size of virtual memory; yet, even now that I've tripled the size of virtual memory and doubled the size of physical memory, I cannot open a 750 MB file, while a 679 MB file opens (and can be operated on) fine.
I downloaded and installed TextPad 4.7.3 today, and ran it, but it will not open the larger file (I had been using TextPad 4.6.3 previously).
What can I do to TextPad or to XP to allow TextPad to open such large files? Thanks,
--Joe / Lunar and Planetary Lab / Tucson
With reference to the Disk full error.
This depends on the amount of contiguous virtual memory that can be allocated on the hard drive. It may be necessary to defragment it. Please see the following URL for further information.
http://www.sysinternals.com/ntw2k/freew ... frag.shtml
In order to maximize the amount of contiguous virtual memory, you should run the normal disk defragmenter before and after running page defrag for system files.
You could also try increasing the amount of available virtual memory as follows:
From the Start menu, choose:
1. Settings
2. Control panel
3. System
4. Advanced
5. Performance options
6. Change
7. Make any changes from this dialog box
XP:
1. Settings
2. Control panel
3. System
4. Advanced
5. Performance / Settings
6. Performance Options / Advanced
7. Virtual Memory / Change
I hope this helps.
Helios Software Solutions
"Disk Full" message... persists
Thank you, Stephen. Kind of you to recommend PageDefrag. I downloaded and installed it, and used it according to your suggestions and to the notes of its author, Mark Russinovich. It is a nice product! And free of charge.
Alas, and alack... there was no effect on how TextPad operates on my computer; it still will not open any file larger than about 649 MB.
I have increased the size of virtual memory, also without good effect.
Does anyone have suggestions on what to try next to make TextPad live up to its promise to open files as large as the size of virtual memory? It would help me. Thank you.
Another thought: are there command-line options (switches) that I could include in the startup path for TextPad? Options or switches that affect the way large files are treated, that is, or the way that memory is made available to TextPad. I searched TextPad's Help index, but "Command Line" seems not to be among its entries. Thanks again.
--Joe / Lunar and Planetary Lab / Tucson
helios wrote:
> This is dependent on the amount of contiguous virtual memory allocated on the hard drive. It may be necessary to defragment it.
> In order to maximize the amount of contiguous virtual memory, you should run the normal disk defragmenter before and after running page defrag for system files.

I ran the normal defragger in XP both before and after running PageDefrag.
- s_reynisson
I could test the file on my system if there's no confidential data in it, just to rule out the possibility of some ghost characters.
I seem to remember some trouble with 7-bit ASCII files, null characters and, of course, files with Unicode characters.
You could try opening the file in binary mode from the File -> Open dialog.
You could try making sure that the file belongs to a document class with a fixed-width font, and that you have word wrap turned off.
Try turning off all virus-protection programs, and try starting in Safe Mode.
From reading your post I'm guessing you have plenty of free hard-drive space, but what about free space on the partition holding your swap file? Are you sure it's on C:\?
Oh, and since you have new memory chips, it would be logical to run a test on them, e.g. memtest86 or something like that.
That's all I can remember at the mo. HTH
Then I open up and see
the person fumbling here is me
a different way to be
"Disk Full" message still persisting
Thank you very much for those suggestions.
Yes, I have plenty of hard-drive space remaining: about 60 GB is free on the 80 GB, single-drive system. The swap file is on C:\ since it's my only drive and only partition.
The file(s) of interest are plain ASCII files of millions of 80-column lines. Each line holds the name or number of an asteroid, its measured position on a given date and at a given time, the brightness of the asteroid, and the code number of the observatory on Earth where the position and brightness were measured.
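(Editor's aside: fixed-width record files like these can be browsed without ever loading the whole file, which sidesteps the editor's memory limit entirely. A minimal Python sketch of a streaming reader; it assumes nothing about the MPC's actual column assignments, only that each record is one line:)

```python
def iter_records(path):
    """Yield one fixed-width text record at a time; memory use stays
    constant no matter how large the file grows."""
    with open(path, "r", encoding="ascii") as f:
        for line in f:
            yield line.rstrip("\n")

def summarize(path):
    """Count records and return the first one, holding at most a
    single line in memory at any moment."""
    count = 0
    first = None
    for record in iter_records(path):
        if first is None:
            first = record
        count += 1
    return count, first
```

A reader like this handles a 1.7 GB file as easily as a 1 KB one, since only the current line is resident.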
The files grow in size as more and more observations are added to them by the Minor Planet Center (MPC), at Harvard-Smithsonian Center for Astrophysics.
After one of the files was updated and grew from 649 MB to about 714 MB, I could no longer open it with TextPad.
All along, I have never been able to open a second such file, which has been larger than 1 GB for several years: it is now 1.7 GB in size.
Of course, the files open just fine under program control (e.g., when opened for reading by my C programs); it is just TextPad that cannot open them now that they've become so large.
I tried your suggestion about opening the file(s) in binary mode, from the Open dialog in TextPad: both huge files open fine in that mode! And very quickly. But the 16-character snippets of ASCII in the binary-mode view are not very helpful to me: of course I wish for the full 80-column field, as before.
I don't believe that there are junk characters, etc., giving TextPad problems opening the files in text mode. Also, again, the files are flat ASCII, so font is not an issue with them. The structure and format of the files also do not change between updates by the MPC: the files just get bigger.
So... it appears that size matters. ;-)
Unfortunately.
Word wrap is not an issue: the lines are all 80 columns wide, with no exceptions. I also universally run TextPad with word wrap off (not tick-marked).
I installed and ran Memtest86 (v. 3.2); a great program! It took about an hour to test the 2 GB of RAM now installed (all OK). The Windows XP Pro system also reports that 2 GB are indeed present, but now I also know that the memory is good (thanks to this very nice diagnostic tool).
I never turn off virus protection. Why would it care whether a file is 649 MB or 714 MB (I mean, *would* it care?).
Well, maybe there are still things I could try.
Are there command-line options or switches which I can use to help nudge a file to open?
Many thanks!
--Joe
Did a small test with a 1.1 GB file, and it looks like TP uses the same amount of memory for write and for read-only access (peaks at 1.7 GB on my 1 GB system), but you could try that, i.e. right-click on the file in Explorer and set the read-only attribute. Also see TP's help under "Command Line Parameters"; there is one to open with read-only access, -r.
Edit: Some real-time virus checkers actually scan txt files when they are opened. Just silly IMHO, but I've seen it with my own eyes... temporarily disabling it just to make sure sounds reasonable to me.
Since my long shots only get longer, perhaps someone else would care to jump in?
"Disk Full" mssg still persisting
Quoting, and replying a smidgeon:
s_reynisson wrote:
> Did a small test with a 1.1 GB file and it looks like TP uses the same amount of memory for write and read-only access (peaks at 1.7 GB on my 1 GB system), but you could try that, i.e. right-click on the file in Explorer and set the read-only attribute.

That was something I had already tried, by other means, but hadn't mentioned: I found a check-box in the TP Open dialog which allows us to open a file in TP for read-only purposes. I checked the check-box, and made moves to open the file, and, wouldn't you know?, ...the Disk Full message came up in its "!" box, as per usual for a file this large on my system (714 MB).

I had also earlier opened a DOS terminal window and looked at the "attrib" settings of my large files. They all have only the "A" attribute bit set, where "A" is for archive. I even cleared the "A" attribute experimentally and tried to open the file with TP after that: still no go!

> Also see TP's help under "Command Line Parameters", there is one to open with read-only access, -r.

I'll try that on Thursday at the Lab; thanks!

> Some real-time virus checkers actually scan txt files when they are opened. Just silly IMHO but I've seen it with my own eyes... temporarily disabling it just to make sure sounds reasonable to me.

The virus checker is always enabled, just as normally, when I open the 649 MB file successfully; my question was whether it would interfere with opening a 714 MB file, and why. But I'll try disabling the SOPHOS virus checker tomorrow, too. I'll take a shot at a long shot!

> Since my long shots only get longer, perhaps someone else would care to jump in? ;)

Thanks, your suggestions have been a great help; this is a learning experience for me (so far). I just don't want to give up on this.
Anyone else have ideas or experience defeating the "Disk Full" message?
--Joe
"Trouble-shooting is LOOKING for trouble."
Disk Full message
Try to rename the file to <filename>.txt
I had the same problem and solved it just by renaming the file with the extension .txt.
Felipe Elias Balarin
Thank you, Felipe.
I tried your suggestion, but it does not work for me, with a 1.9 GB text file. I renamed the file extension to "txt", but I still get the "Disk Full" message, and the file does not load.

Do you understand why the "fix" works for you? Is there anything particular about the kind(s) of filename extensions that TextPad "likes", and why it should not like files bigger than a certain size if they are not named "*.txt"? I have no good theory about it.

I'll try removing the filename extension entirely. If you try that, are you able to open a file that did not open with a non-"txt" extension before? In other words, can you open files with *no* extension as well as you can open files with "txt" filename extensions?

If so, I think this will be a difficult one for the software authors to explain. Or at least interesting!

I'm sorry I cannot make your suggestion work. Thank you again!
--Joe / Tucson, Arizona / USA
> Try to rename the file to <filename>.txt
> Felipe
File just too big for TextPad
I am having the same problem. I could have sworn the old versions of TextPad opened my large files just fine, but I am trying for the first time since upgrading to version 5; at first I received your error message, and now TextPad just grinds and grinds trying to open the file. I finally just cancelled it.
This is pretty bad news for me. I've used TextPad for something like 5-7 years now as my text editor, but I really must have the ability to load large files fairly quickly.
With reference to the Disk full error.
This depends on the amount of contiguous virtual memory that can be allocated on the hard drive. It may be necessary to defragment it. Please see the following URL for further information.
http://www.microsoft.com/technet/sysint ... efrag.mspx
In order to maximize the amount of contiguous virtual memory, you should run the normal disk defragmenter before and after running page defrag for system files.
You could also try increasing the amount of available virtual memory as follows:
From the Start menu, choose:
1. Settings
2. Control panel
3. System
4. Advanced
5. Performance / Settings
6. Performance Options / Advanced
7. Virtual Memory / Change
I hope this helps.
Last edited by bbadmin on Thu May 10, 2007 9:37 am, edited 2 times in total.
http://www.sysinternals.com/ntw2k/freew ... frag.shtml
gives a 404 not found.
http://www.microsoft.com/technet/sysint ... efrag.mspx
works for me ...
Please also refer to the "Disk Full Message" thread in the "Known Problems" forum.
These two threads discuss the same issue, and there are some differences of opinion concerning the diagnosis. A number of users do not believe that virtual memory capacity, fragmented or not, has anything to do with the problem.
I note from my own experiments that when a file is larger than some particular size, TextPad 5.0.3 returns the Disk Full error immediately; it makes no attempt to use up the available memory. With a file below that size, TP 5.0.3 takes a long time to open it, eating up memory as it goes.
Can somebody on your software team please look into this and confirm that there is a problem?
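(Editor's aside: the immediate failure above a threshold is what one would expect if the editor requests one contiguous allocation the size of the file and gives up as soon as the OS refuses. The sketch below, in Python, probes the largest single anonymous mapping the current process can get by binary search; the exact ceiling is an OS-dependent assumption, not a documented figure. On 32-bit Windows XP, the user address space is normally 2 GB per process, so even a 1.7 GB file can rarely be mapped in one piece there:)

```python
import mmap

def largest_contiguous_mapping(lo=1 << 20, hi=1 << 42):
    """Binary-search the largest single anonymous mapping (in bytes)
    the OS will grant this process. A file bigger than this cannot be
    memory-mapped in one contiguous piece."""
    best = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        try:
            m = mmap.mmap(-1, mid)   # request `mid` contiguous bytes
        except (OSError, ValueError, OverflowError):
            hi = mid - 1             # refused: search the lower half
        else:
            m.close()
            best = mid
            lo = mid + 1             # granted: search the higher half
    return best
```

Calling `largest_contiguous_mapping()` on a fragmented 32-bit system would report well under 2 GB; on a 64-bit system the answer is typically bounded by RAM-plus-swap or address-space policy instead.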
These two threads are discussing the same issue and there are some differences of opinion concerning the diagnosis. A number of users do not believe that virtual memory capacity, fragmented or not, has anything to do with the problem.
I do note from my own experiments that when a file is larger than some particular size, TextPad 5.0.3 returns the Disk Full error immediately; it makes no attempt to use-up the available memory. With a file below some particular size, TP503 takes a long time to open it, eating-up the memory as it goes.
Can somebody on your software team please look into this an confirm that there is a problem ?
When opening a file, TextPad simply asks the operating system to allocate an amount of virtual memory as big as the file, in one contiguous chunk, so that it can memory-map it. If the operating system declines the request, TextPad displays the "Disk full" message. Memory mapping provides significant performance advantages compared with allocating memory on the heap, but can have this disadvantage for very big files. Note that the OS shadows all physical memory allocations in VM, so even if you have more RAM than the size of a file, the allocation will still fail if there is not sufficient contiguous VM.
When the OS is first installed, it will allocate what it thinks is enough VM, knowing that it can allocate more if needed. However, subsequent allocations will not be contiguous with the first one, so the VM becomes fragmented, and the built-in defragmenter does not defragment virtual memory.
Another factor that seems to affect this is bad sectors on the hard drive. These are transparently relocated when the drive is formatted, but the corresponding head movement would make it inefficient to allocate contiguous VM across those sectors. Hence, even with a hard drive that's only used for VM, the VM may be fragmented.
This is a performance trade-off that works for the vast majority of files that TextPad is asked to edit, but there will always be exceptions, due to operating-system limitations.
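(Editor's aside: the mapping step described above looks roughly like the following in any POSIX or Win32 program. This Python sketch uses a small stand-in file and is not TextPad's actual code; it only illustrates the single-contiguous-mapping pattern:)

```python
import mmap
import os
import tempfile

# Create a small stand-in for the large data file.
path = os.path.join(tempfile.mkdtemp(), "sample.txt")
with open(path, "wb") as f:
    f.write(b"A" * 80 + b"\n" + b"B" * 80 + b"\n")

with open(path, "rb") as f:
    # One contiguous mapping covering the whole file (length 0 means
    # "map it all"). If the OS cannot find a contiguous free region of
    # this size in the process's address space, this call fails -- the
    # situation TextPad reports as "Disk full".
    view = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first_record = view[:80]     # random access, no read() calls needed
    view.close()
```

The appeal is that, once mapped, any byte of the file is addressable directly; the cost is that the whole file must fit into one free contiguous span of virtual address space.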
- Helios Software Solutions