[Pw_forum] Can anyone explain my errors in vasp

Sanjeev Kumar Gupta sanjeev0302 at rediffmail.com
Thu Jan 11 05:00:21 CET 2007


  
Dear Daya,
This is a forum for Quantum ESPRESSO users, so unfortunately we cannot help you with VASP here. However, if you search on Google for VASP, you will find a dedicated forum for it. It is available online.
Regards
s K Gupta

On Thu, 11 Jan 2007 Axel Kohlmeyer wrote :
>daya sagar wrote:
>>Hi,
>>
>>   My name is Dayasagar. I have been using VASP to do ab initio DFT
>
>please note, that this is a forum for the quantum espresso package,
>see http://www.quantum-espresso.org/ (you might want to check it out). ;-)
>
>>simulations of GaN NWs with a 225 eV ENCUT and a vacuum of 2.6 nm along the X and Y directions. Can anyone please explain the following error to me?
>>My run went on until the 37th ionic iteration and then showed the following error.
>>-------------------------------------------------------------------------------------------------------------------
>>DAV:  35    -0.439193918571E+05    0.19144E+06   -0.24729E+05 19376   0.953E+02    0.234E+02
>>DAV:  36    -0.274732624636E+06   -0.23081E+06   -0.27733E+06 16696   0.116E+04    0.196E+02
>>DAV:  37    -0.972624721307E+05    0.17747E+06   -0.31438E+05 19200   0.120E+03    0.255E+02
>
>here's the origin of your problem (and it has nothing to do with VASP):
>
>>forrtl: Permission denied
>>getRegFromUnwindContext: Can't get Gr0 from UnwindContext, using 0
>>forrtl: severe (9): permission to access file denied, unit 8, file
>
>well, it looks as if you are either running out of disk quota or
>your write access to the disk has been disabled.
>
>axel.
>
>
>>/m/utl0214/daya/gan/nanowires/100-axis/44ga-56n-dia2.0nm/vacuum-30/OUTCAR
>>Image              PC                Routine            Line        Source
>>vasp.4.6.26_ipf_m  4000000000727E80  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  4000000000726F90  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  40000000006BB120  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  400000000062DAC0  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  400000000062EA30  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  400000000064CC00  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  4000000000618340  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  40000000004F29B0  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  400000000007CCF0  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  4000000000005DD0  Unknown               Unknown  Unknown
>>libc.so.6.1        2000000000DF3430  Unknown               Unknown  Unknown
>>vasp.4.6.26_ipf_m  40000000000057C0  Unknown               Unknown  Unknown
>>FATAL ERROR on MPI node 4 (ipf232): GM send to MPI node 0 (ipf235 [00:60:dd:48:cb:55]) failed: status 17 (target port was closed) the peer process has not started, has exited or is dead
>>Small/Ctrl message completion error!
>>mpiexec: Warning: accept_abort_conn: MPI_Abort from IP 10.16.1.84, killing all.
>>mpiexec: killall: caught signal 15 (Terminated).
>>mpiexec: kill_tasks: killing all tasks.
>>wait_tasks: waiting for ipf235 ipf235 ipf234 ipf234 ipf232 ipf232 ipf231 ipf231
>>FATAL ERROR on MPI node 2 (ipf234): GM send to MPI node 0 (ipf235 [00:60:dd:48:cb:55]) failed: status 17 (target port was closed) the peer process has not started, has exited or is dead
>>Small/Ctrl message completion error!
>>mpiexec: process_kill_event: evt 18 task 0 on ipf235.
>>mpiexec: process_kill_event: evt 19 task 1 on ipf235.
>>mpiexec: process_kill_event: evt 20 task 2 on ipf234.
>>mpiexec: process_kill_event: evt 24 task 6 on ipf231.
>>mpiexec: process_kill_event: evt 25 task 7 on ipf231.
>>mpiexec: process_kill_event: evt 21 task 3 on ipf234.
>>mpiexec: process_obit_event: evt 14 task 6 on ipf231 stat 265.
>>mpiexec: process_obit_event: evt 16 task 7 on ipf231 stat 265.
>>mpiexec: process_kill_event: evt 22 task 4 on ipf232.
>>mpiexec: process_kill_event: evt 23 task 5 on ipf232.
>>wait_tasks: waiting for ipf235 ipf235 ipf234 ipf234 ipf232 ipf232
>>mpiexec: process_obit_event: evt 12 task 4 on ipf232 stat 255.
>>mpiexec: process_obit_event: evt 17 task 5 on ipf232 stat 265.
>>mpiexec: process_obit_event: evt 13 task 2 on ipf234 stat 265.
>>mpiexec: process_obit_event: evt 15 task 3 on ipf234 stat 265.
>>wait_tasks: waiting for ipf235 ipf235
>>--------------------------------------------------------------------------------------------------------------------
>>
>>Thanks.
>>
>>Regards,
>>Daya
>>
>>
>>
>
>-- =======================================================================
>Axel Kohlmeyer   akohlmey at cmm.chem.upenn.edu   http://www.cmm.upenn.edu
>   Center for Molecular Modeling   --   University of Pennsylvania
>Department of Chemistry, 231 S.34th Street, Philadelphia, PA 19104-6323
>tel: 1-215-898-1582,  fax: 1-215-573-6233,  office-tel: 1-215-898-5425
>=======================================================================
>If you make something idiot-proof, the universe creates a better idiot.
>_______________________________________________
>Pw_forum mailing list
>Pw_forum at pwscf.org
>http://www.democritos.it/mailman/listinfo/pw_forum
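
For anyone hitting the same forrtl "permission to access file denied" error: a quick way to test the diagnosis Axel gives above (quota exhausted, or write access revoked) is a sketch like the following. RUN_DIR is a hypothetical placeholder, not a path from the original thread; point it at your own run directory.

```shell
# Sketch: check the two causes of forrtl severe (9) that Axel mentions.
# RUN_DIR is a hypothetical placeholder -- set it to your VASP run directory.
RUN_DIR=${RUN_DIR:-.}

# 1. Can we still create a file where OUTCAR lives?
if touch "$RUN_DIR/.write_test" 2>/dev/null; then
    rm -f "$RUN_DIR/.write_test"
    WRITE_STATUS="OK"
else
    WRITE_STATUS="DENIED"
fi
echo "write access to $RUN_DIR: $WRITE_STATUS"

# 2. How full is the filesystem, and is a user quota in force?
# (quota may not be installed or enabled on every cluster)
df -h "$RUN_DIR"
quota -s 2>/dev/null || true
```

If the write test fails or the quota report shows you at your limit, the VASP crash and the subsequent MPI abort on the other nodes are just secondary symptoms of the failed write to OUTCAR.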


====================================================
Sanjeev Kumar Gupta
Junior Research Fellow (DAE-BRNS)
Computational Condensed Matter Physics Lab.(CCMP)
Department of Physics, Faculty of Science,
The M.S.University of Baroda, Vadodara - 390 002.
Ph.No: +91-265-279 5339 (O) extn: 30-25 
mobile:09374616019
Email: sanjeev0302 at rediffmail.com
       sanjeev0302 at yahoo.co.in 
       skgupta-phy at msubaroda.ac.in
==================================================== 

Residential Address- 

Dr. Vikram Sarabhai Hall, 
Room No.-95, Boys' Hostel,M.S.University Campus, 
Pratapgunj,Vadodara-390 002,Gujarat,INDIA 