Merge lp:~garsua/siesta/trunk-elsi-dm into lp:siesta

Proposed by Victor GS
Status: Superseded
Proposed branch: lp:~garsua/siesta/trunk-elsi-dm
Merge into: lp:siesta
Diff against target: 4436 lines (+3744/-70) (has conflicts)
32 files modified
Docs/siesta.tex (+119/-2)
Src/MPI/mpi_siesta.F90 (+18/-1)
Src/Makefile (+26/-18)
Src/compute_dm.F (+25/-1)
Src/dhscf.F (+10/-0)
Src/fold_auxcell.f90 (+85/-0)
Src/fsiesta_mpi.F90 (+4/-0)
Src/initparallel.F (+1/-1)
Src/m_elsi_interface.F90 (+1601/-0)
Src/m_ncdf_siesta.F90 (+3/-0)
Src/m_pexsi_dos.F90 (+1/-1)
Src/m_pexsi_driver.F90 (+1/-1)
Src/m_pexsi_local_dos.F90 (+1/-1)
Src/m_redist_spmatrix.F90 (+187/-21)
Src/parallel.F (+1/-1)
Src/post_scf_work.F (+33/-2)
Src/read_options.F90 (+11/-0)
Src/siesta.F (+9/-9)
Src/siesta_forces.F90 (+30/-5)
Src/siesta_init.F (+23/-5)
Src/siesta_options.F90 (+1/-0)
Src/siesta_tddft.F90 (+1/-1)
Src/state_init.F (+4/-0)
Src/write_subs.F (+1/-0)
Tests/si-quantum-dot/coords.Si987H372.fdf (+1368/-0)
Tests/si-quantum-dot/makefile (+7/-0)
Tests/si-quantum-dot/si-quantum-dot.fdf (+39/-0)
Tests/si-quantum-dot/si-quantum-dot.pseudos (+1/-0)
Tests/sih-elsi/makefile (+6/-0)
Tests/sih-elsi/sih-elsi.fdf (+121/-0)
Tests/sih-elsi/sih-elsi.pseudos (+1/-0)
version.info (+5/-0)
Text conflict in Src/siesta_forces.F90
Text conflict in version.info
To merge this branch: bzr merge lp:~garsua/siesta/trunk-elsi-dm
Reviewer: Alberto Garcia (Needs Resubmitting)
Review via email: mp+361820@code.launchpad.net

Commit message

Cosmetic changes on m_elsi_interface.F90
- Removed "use fdf, only: fdf_get" from elsi_real_solver and elsi_complex_solver subroutines
- Unindented lines from 401 to 510 (in elsi_real_solver) and from 1351 to 1460 (in elsi_complex_solver)

Revision history for this message
Alberto Garcia (albertog) wrote :

Victor: As a test of the MR process it is fine, but you should specify the target branch as
~albertog/siesta/trunk-elsi-dm

review: Needs Resubmitting
Revision history for this message
Victor GS (garsua) wrote :

Thanks Alberto, I forgot to include it. I'll redo the merge request and
see if it comes out right this time.


Unmerged revisions

696. By Victor GS

Cosmetic changes on m_elsi_interface.F90
- Removed "use fdf, only: fdf_get" from elsi_real_solver and elsi_complex_solver subroutines
- Unindented lines from 401 to 510 (in elsi_real_solver) and from 1351 to 1460 (in elsi_complex_solver)

695. By Alberto Garcia

Do not get the EDM during the SCF cycle + NTPoly support

* In compute_dm, the call to the ELSI ('get_dm') interface uses a
flag to disable the generation of the EDM matrix. In this case
only the DM is computed.

After the scf loop, in 'post_scf_work', a further call to the ELSI
interface requests (only) the EDM.

Note that this is achieved through a logical flag 'Get_EDM_Only',
which is not really appropriate. It would be better to have a
more explicit flag, such as:

           output_mode="DM|EDM"

The ELSI shutdown has been moved towards the end of 'siesta_forces',
and the 'Siesta_Worker' logic has been preserved in that routine (even
though it is not working elsewhere).

* NTPoly support has been added to the interface

(Thanks to Victor M. Garcia-Suarez and Victor Yu)
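
A condensed view of the resulting call pattern (argument lists abridged
from the compute_dm.F changes in the diff below; the post-SCF call is
inferred from the description above, as post_scf_work.F is not shown in
this preview):

    ! During the SCF loop (in compute_dm): DM only, EDM disabled
    call elsi_getdm(iscf, ..., Dscf, Escf, ef, Entropy,
   &                occtol, neigwanted, Get_EDM_Only=.false.)

    ! After the SCF loop (in post_scf_work): request only the EDM
    call elsi_getdm(iscf, ..., Dscf, Escf, ef, Entropy,
   &                occtol, neigwanted, Get_EDM_Only=.true.)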

694. By Alberto Garcia

Wrap MPI check on number of procs for ELSI solver

693. By Alberto Garcia

Add Si quantum dot test case for ELSI

692. By Alberto Garcia

Fix bug in computation of Delta-V for PEXSI mu bracketing

A new array is needed to properly update the value of the
change in potential.

(+ added a declaration for 'date_stamp' in the auxiliary
code for version compatibility)
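
A minimal sketch of the fix, assuming the module-level arrays 'v_old'
and 'delta_v' declared in m_elsi_interface.F90 ("For mu update in PEXSI
solver"); the update site itself is not visible in this preview:

    ! Track the change in the SCF potential between steps; delta_v is
    ! the new array, used to shift the PEXSI mu bracket (illustrative)
    delta_v(:,:) = Vscf(:,:) - v_old(:,:)
    v_old(:,:) = Vscf(:,:)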

691. By Alberto Garcia

Change the spin-reduction convention for entropy

For ELSI versions after 2018-08-17, there is no need to reduce over spins
when computing the entropy.

To use older versions (such as the 2.0.2 release), use the preprocessor symbol

    SIESTA__ELSI__OLD_SPIN_CONVENTION

If the version is 'old', but the above symbol is not used, a compilation error
will be triggered. (This works by means of a call to 'elsi_get_datestamp', which
was added just around the same time as the spin-reduction-convention change:
Aug 17 vs Aug 21, 2018. ELSI versions in between will not work without further
intervention.)
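
The guard, condensed from elsi_real_solver in m_elsi_interface.F90
(see the diff below):

    #ifdef SIESTA__ELSI__OLD_SPIN_CONVENTION
          ! Old convention: still reduce the entropy over spins
          call globalize_sum(ets_spin, ets, comm=elsi_Spin_comm)
    #else
          ! New convention: no reduction needed. This call only exists
          ! in ELSI versions from 2018-08-21 on, so a missing symbol at
          ! compile time flags an old library.
          call elsi_get_datestamp(date_stamp)
    #endif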

690. By Alberto Garcia

Update documentation. Check that Nodes is a multiple of nk*nspin
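
A sketch of the check described (names taken from the manual text added
in Docs/siesta.tex; the exact placement in the source is an assumption):

    ! The orbital-distribution communicator is split into nk*nspin
    ! copies, so the global MPI process count must divide evenly
    if (mod(Nodes, nkpnt*nspin) /= 0) then
       call die('ELSI: number of MPI processes must be a multiple of nk*nspin')
    endif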

689. By Alberto Garcia

Fix weight of DM and EDM. Working

688. By Alberto Garcia

Do not make shadow copies of elsi_global_comm

687. By Alberto Garcia

Change some ptrs to allocs. Fix communicators

Preview Diff

=== modified file 'Docs/siesta.tex'
--- Docs/siesta.tex 2019-01-04 07:01:47 +0000
+++ Docs/siesta.tex 2019-01-16 11:00:58 +0000
@@ -952,6 +952,33 @@
   LIBS += -Wl,--allow-multiple-definition -lpexsi_linux <>
 \end{shellexample}
 
+\item[\href{http://elsi-interchange.org}{ELSI}]%
+  \index{ELSI}%
+  \index{External library!ELSI}%
+  The ELSI library may be used with \siesta\ for access to a range of
+  solvers, see Sec.~\ref{SolverELSI}. Currently the interface is
+  implemented (tested) as of ELSI version 2.0.2. If
+  newer versions retain the same interface they may also be used.
+
+  Add these flags to your \file{arch.make} file to enable ELSI
+\begin{shellexample}
+  INCFLAGS += -I/opt/elsi/include
+  LIBS += -L/opt/elsi/lib <>
+  FPPFLAGS += -DSIESTA__ELSI
+\end{shellexample}
+  \index{compile!pre-processor!-DSIESTA\_\_ELSI}
+  where \shell{<>} represents the appropriate ELSI libraries (see the
+  ELSI installation instructions).
+  %
+  If one experiences linker failures, one possible solution that may
+  help is
+\begin{shellexample}
+  LIBS += -lmpi_cxx -lstdc++
+\end{shellexample}
+  which is due to ELSI containing some parts written in C++, while the
+  linking is done by the Fortran compiler. The exact library name for
+  your MPI vendor may vary.
+
 
 \item[CheSS]%
 \index{CheSS}%
@@ -6021,6 +6048,11 @@
 of the density matrix via an efficient scheme of selected
 inversion (see Sec~\ref{SolverPEXSI}).
 
+Additionally, \siesta\ can use the ELSI library of solvers (see
+Sec.~\ref{SolverELSI}), which provides versions of OMM and PEXSI in
+addition to diagonalization, and the new linear-scaling CheSS library
+(see Sec.~\ref{SolverCheSS}).
+
 The calculation of the H and S matrix elements is always done with an
 O(N) method. The actual scaling is not linear for small systems, but
 it becomes O(N) when the system dimensions are larger than the scale
@@ -6044,8 +6076,8 @@
   Character string to choose among diagonalization (\fdf*{diagon}),
   cubic-scaling minimization (\fdf*{OMM}), Order-N (\fdf*{OrderN})
   solution of the Kohn-Sham Hamiltonian, \fdf*{transiesta}, the
-  PEXSI method (\fdf*{PEXSI}) or the \fdf*{CheSS} solver.
-
+  PEXSI method (\fdf*{PEXSI}), ELSI solver (\fdf*{ELSI}) or the \fdf*{CheSS} solver.
+
 \end{fdfentry}
 
 
@@ -6867,6 +6899,91 @@
 
 \end{fdflogicalF}
 
+\subsection{The ELSI solver family}
+\label{SolverELSI}
+\index{ELSI solvers}
+
+The ELSI library provides a unified interface to a range of solvers,
+including ELPA (diagonalization), PEXSI, OMM, and SIPS (based on spectrum
+slicing). It can be downloaded and installed freely from
+\url{http://elsi-interchange.org}.
+
+The current interface in \siesta\ is based on the ``DM'' mode in ELSI:
+sparse H and S matrices are passed to the library, and sparse DM and
+EDM matrices are returned. Internally, the library performs any extra
+operations needed. K-points and/or spin-polarization (in collinear
+form) are also supported, in fully parallel mode. This means that
+k-points/spin-polarized calculations will be carried out splitting the
+communicator for orbital distribution into nk$\times$nspin copies,
+where nk is the number of k-points and nspin the number of spin
+components. The total number of MPI processors must be a multiple of
+nk$\times$nspin.
+
+See Sec.~\ref{sec:libs} for details on installing \siesta\ with ELSI support.
+
+\subsubsection{Input parameters}
+
+The most important parameter is the solver to be used. Most other
+options are given default values by the program which are nearly
+always appropriate.
+
+\begin{fdfentry}{ELSI!Solver}[string]<ELPA>
+  A specification of the solver wanted. Possible values are
+  \begin{itemize}
+  \item ELPA
+  \item OMM
+  \item PEXSI
+  \item SIPS
+  \end{itemize}
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!Broadening!Method}[string]<``fermi''>
+The scheme for level broadening. Valid options are ``fermi'',
+``gauss'', ``mp'' (Methfessel-Paxton), ``cold''
+(Marzari-Vanderbilt cold smearing).
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!Output!Level}[integer]<0>
+  The level of detail provided in the output. Possible values are 0, 1,
+  2, and 3.
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!Output!Json}[integer]<1>
+  If not 0, a separate log file in JSON format will be written out.
+\end{fdfentry}
+
+Other fdf options are mapped in a direct way to the options described in the
+ELSI manual:
+
+\begin{fdfentry}{ELSI!Broadening!MPOrder}[integer]<1>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!ELPA!Flavor}[integer]<2>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!OMM!Flavor}[integer]<0>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!OMM!ELPA!Steps}[integer]<3>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!OMM!Tolerance}[real]<10e-9>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!PEXSI!TasksPerPole}[integer]<{\bf no default}>
+  See the discussion on the three-level PEXSI parallelism in the ELSI manual.
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!PEXSI!TasksSymbolic}[integer]<1>
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!SIPS!Slices}[integer]<{\bf no default}>
+  See the discussion in the ELSI manual.
+\end{fdfentry}
+
+\begin{fdfentry}{ELSI!SIPS!ELPA!Steps}[integer]<2>
+\end{fdfentry}
+
 
 \subsection{The CheSS solver}
 \label{SolverCheSS}
 
=== modified file 'Src/MPI/mpi_siesta.F90'
--- Src/MPI/mpi_siesta.F90 2016-09-16 12:25:04 +0000
+++ Src/MPI/mpi_siesta.F90 2019-01-16 11:00:58 +0000
@@ -37,12 +37,29 @@
   USE MPI__INCLUDE, true_MPI_Comm_World => MPI_Comm_World
 #endif /* NO_MPI_INTERFACES */
 
+  ! The reason to define true_MPI_Comm_World as a pointer to the
+  ! (immutable) MPI_Comm_World parameter in the MPI module is to keep
+  ! it as reference. The MPI_Comm_World *variable* defined below can
+  ! be redefined at will. This practice is needed because most Siesta
+  ! routines have hard-wired references to MPI_Comm_World as their
+  ! communicator. Instead of generalizing the interfaces and passing the
+  ! communicator as an argument (work in progress), we use the kludge
+  ! of re-defining MPI_Comm_World.
+
 ! The following construction allows to supplant MPI_Comm_World within SIESTA,
 ! and to use it as a subroutine with its own internal MPI communicator.
 
+  ! Siesta instances should refer to MPI_Comm_DFT.
+  ! This is the case, for example, when a driver program splits the
+  ! global communicator to dispatch several simultaneous Siesta instances.
+  ! Typically, comm_world = comm_dft, but not for calculations (PEXSI, ELSI)
+  ! for which only a subset of processors carry out the core Siesta operations
+  ! that have hard-wired references to mpi_comm_world.
+
   integer, public :: MPI_Comm_World = true_MPI_Comm_World
+  integer, public :: MPI_Comm_DFT = true_MPI_Comm_World
 
   public :: true_MPI_Comm_World
 
 
 #ifdef GRID_SP
 
=== modified file 'Src/Makefile'
--- Src/Makefile 2018-12-06 09:30:22 +0000
+++ Src/Makefile 2019-01-16 11:00:58 +0000
@@ -156,6 +156,8 @@
 	write_orb_indx.o \
 	die.o m_pexsi.o m_pexsi_driver.o m_pexsi_dos.o m_pexsi_local_dos.o\
 	m_chess.o \
+	m_elsi_interface.o \
+	fold_auxcell.o \
 	m_redist_spmatrix.o \
 	class_Distribution.o\
 	m_dminim.o m_zminim.o \
@@ -293,7 +295,7 @@
 	    "FFLAGS=$(FFLAGS:$(IPO_FLAG)=)" module )
 
 #--------------------------------------------------------------
-MS=MatrixSwitch.a
+MS?=MatrixSwitch.a
 $(MS): $(MPI_INTERFACE)
 	(cd MatrixSwitch/src ; \
 	$(MAKE) -j 1 \
@@ -641,11 +643,11 @@
 class_TriMat.o: alloc.o intrinsic_missing.o
 coceri.o: files.o periodic_table.o precision.o units.o
 compute_dm.o: atomlist.o class_SpData1D.o compute_ebs_shift.o diagon.o
-compute_dm.o: iodmhs_netcdf.o kpoint_scf.o m_chess.o m_dminim.o m_energies.o
-compute_dm.o: m_eo.o m_hsx.o m_pexsi_driver.o m_rmaxh.o m_spin.o m_steps.o
-compute_dm.o: m_transiesta.o m_ts_global_vars.o m_zminim.o normalize_dm.o
-compute_dm.o: ordern.o parallel.o precision.o siesta_geom.o siesta_options.o
-compute_dm.o: sparse_matrices.o sys.o units.o
+compute_dm.o: iodmhs_netcdf.o kpoint_scf.o m_chess.o m_dminim.o
+compute_dm.o: m_elsi_interface.o m_energies.o m_eo.o m_hsx.o m_pexsi_driver.o
+compute_dm.o: m_rmaxh.o m_spin.o m_steps.o m_transiesta.o m_ts_global_vars.o
+compute_dm.o: m_zminim.o normalize_dm.o ordern.o parallel.o precision.o
+compute_dm.o: siesta_geom.o siesta_options.o sparse_matrices.o sys.o units.o
 compute_ebs_shift.o: m_mpi_utils.o parallel.o precision.o
 compute_energies.o: atomlist.o class_SpData1D.o class_SpData2D.o
 compute_energies.o: class_SpData2D.o dhscf.o files.o m_dipol.o m_energies.o
@@ -683,11 +685,12 @@
 dfscf.o: meshphi.o parallel.o parallelsubs.o precision.o sys.o
 dhscf.o: alloc.o atmfuncs.o bsc_xcmod.o cellxc_mod.o delk.o dfscf.o
 dhscf.o: doping_uniform.o files.o forhar.o iogrid_netcdf.o m_charge_add.o
-dhscf.o: m_efield.o m_hartree_add.o m_iorho.o m_iotddft.o m_mesh_node.o
-dhscf.o: m_ncdf_io.o m_ncdf_siesta.o m_partial_charges.o m_rhog.o m_spin.o
-dhscf.o: m_ts_global_vars.o m_ts_hartree.o m_ts_options.o m_ts_voltage.o mesh.o
-dhscf.o: meshdscf.o meshsubs.o moremeshsubs.o parallel.o parsing.o precision.o
-dhscf.o: rhofft.o rhoofd.o siesta_options.o sys.o units.o vmat.o
+dhscf.o: m_efield.o m_elsi_interface.o m_hartree_add.o m_iorho.o m_iotddft.o
+dhscf.o: m_mesh_node.o m_ncdf_io.o m_ncdf_siesta.o m_partial_charges.o m_rhog.o
+dhscf.o: m_spin.o m_ts_global_vars.o m_ts_hartree.o m_ts_options.o
+dhscf.o: m_ts_voltage.o mesh.o meshdscf.o meshsubs.o moremeshsubs.o parallel.o
+dhscf.o: parsing.o precision.o rhofft.o rhoofd.o siesta_options.o sys.o units.o
+dhscf.o: vmat.o
 diag.o: alloc.o diag_option.o parallel.o precision.o sys.o
 diag2g.o: fermid.o intrinsic_missing.o parallel.o parallelsubs.o precision.o
 diag2g.o: sys.o
@@ -838,6 +841,8 @@
 m_dscfcomm.o: alloc.o parallel.o parallelsubs.o precision.o schecomm.o
 m_efield.o: atmfuncs.o m_cite.o m_ts_global_vars.o mesh.o parallel.o
 m_efield.o: precision.o siesta_cml.o siesta_geom.o sys.o units.o
+m_elsi_interface.o: alloc.o class_Distribution.o fold_auxcell.o m_mpi_utils.o
+m_elsi_interface.o: m_redist_spmatrix.o parallel.o precision.o units.o
 m_energies.o: precision.o
 m_eo.o: precision.o
 m_evolve.o: atomlist.o cranknic_evolg.o cranknic_evolk.o m_spin.o precision.o
@@ -1194,8 +1199,9 @@
 post_scf_work.o: atomlist.o class_Fstack_Pair_Geometry_SpData2D.o
 post_scf_work.o: class_Geometry.o class_Pair_Geometry_SpData2D.o compute_dm.o
 post_scf_work.o: diagon.o final_H_f_stress.o kpoint_scf.o m_dminim.o
-post_scf_work.o: m_energies.o m_eo.o m_spin.o m_steps.o m_zminim.o mneighb.o
-post_scf_work.o: parallel.o siesta_geom.o siesta_options.o sparse_matrices.o
+post_scf_work.o: m_elsi_interface.o m_energies.o m_eo.o m_spin.o m_steps.o
+post_scf_work.o: m_zminim.o mneighb.o parallel.o siesta_geom.o siesta_options.o
+post_scf_work.o: sparse_matrices.o
 print_spin.o: atomlist.o m_mpi_utils.o m_spin.o parallel.o precision.o
 print_spin.o: siesta_cml.o sparse_matrices.o
 printmatrix.o: alloc.o
@@ -1288,11 +1294,12 @@
 siesta_forces.o: atomlist.o class_Fstack_Data1D.o class_SpData2D.o compute_dm.o
 siesta_forces.o: compute_energies.o compute_max_diff.o densematrix.o
 siesta_forces.o: dm_charge.o files.o final_H_f_stress.o flook_siesta.o
-siesta_forces.o: kpoint_scf.o m_check_walltime.o m_convergence.o m_energies.o
-siesta_forces.o: m_forces.o m_initwf.o m_iodm_old.o m_mixing.o m_mixing_scf.o
-siesta_forces.o: m_mpi_utils.o m_ncdf_siesta.o m_pexsi.o m_pexsi_driver.o
-siesta_forces.o: m_rhog.o m_spin.o m_steps.o m_stress.o m_transiesta.o
-siesta_forces.o: m_ts_charge.o m_ts_electype.o m_ts_global_vars.o m_ts_method.o
+siesta_forces.o: kpoint_scf.o m_check_walltime.o m_convergence.o
+siesta_forces.o: m_elsi_interface.o m_energies.o m_forces.o m_initwf.o
+siesta_forces.o: m_iodm_old.o m_mixing.o m_mixing_scf.o m_mpi_utils.o
+siesta_forces.o: m_ncdf_siesta.o m_pexsi.o m_pexsi_driver.o m_rhog.o m_spin.o
+siesta_forces.o: m_steps.o m_stress.o m_transiesta.o m_ts_charge.o
+siesta_forces.o: m_ts_electype.o m_ts_global_vars.o m_ts_method.o
 siesta_forces.o: m_ts_options.o mixer.o parallel.o post_scf_work.o precision.o
 siesta_forces.o: save_density_matrix.o scfconvergence_test.o setup_H0.o
 siesta_forces.o: setup_hamiltonian.o siesta_cml.o siesta_dicts.o siesta_geom.o
@@ -1493,6 +1500,7 @@
 m_final_h_f_stress.o: final_H_f_stress.o
 m_find_kgrid.o: find_kgrid.o
 m_fire_optim.o: fire_optim.o
+m_fold_auxcell.o: fold_auxcell.o
 m_forhar.o: forhar.o
 m_get_kpoints_scale.o: get_kpoints_scale.o
 m_gradient.o: gradient.o
 
=== modified file 'Src/compute_dm.F'
--- Src/compute_dm.F 2018-11-09 09:36:46 +0000
+++ Src/compute_dm.F 2019-01-16 11:00:58 +0000
@@ -36,6 +36,9 @@
 #ifdef SIESTA__PEXSI
       use m_pexsi_solver, only: pexsi_solver
 #endif
+#ifdef SIESTA__ELSI
+      use m_elsi_interface, only: elsi_getdm
+#endif
       use m_hsx, only : write_hs_formatted
 #ifdef MPI
       use mpi_siesta
@@ -75,7 +78,7 @@
       if (SIESTA_worker) call timer( 'compute_dm', 1 )
 
 #ifdef MPI
-      call MPI_Bcast(isolve,1,MPI_integer,0,true_MPI_Comm_World,mpierr)
+      call MPI_Bcast(isolve,1,MPI_integer,0,MPI_Comm_DFT,mpierr)
 #endif
 
       if (SIESTA_worker) then
@@ -137,6 +140,27 @@
       endif
       if (.not. SIESTA_worker) RETURN
 #endif
+#ifdef SIESTA__ELSI
+      if (isolve .eq. SOLVE_ELSI) then
+         ! This test done in node 0 since NonCol and SpOrb
+         ! are not set for ELSI-solver-only processes
+         if (ionode) then
+            if (spin%NCol .or. spin%SO) call die(
+     $           "The ELSI solver does not implement "//
+     $           "non-coll spins or Spin-orbit yet")
+         endif
+         call elsi_getdm(iscf, no_s, spin%spinor,
+     &        no_l, maxnh, no_u,
+     &        numh, listhptr, listh,
+     &        H, S, qtot, temp,
+     &        xijo,
+     &        kpoint_scf%N, kpoint_scf%k, kpoint_scf%w,
+     &        eo, qo, Dscf, Escf, ef, Entropy,
+     &        occtol, neigwanted, Get_EDM_Only=.false.)
+         Ecorrec = 0.0_dp
+      endif
+      if (.not. SIESTA_worker) RETURN
+#endif
       ! Here we decide if we want to calculate one or more SCF steps by
       ! diagonalization before proceeding with the OMM routine
       CallDiagon=.false.
 
=== modified file 'Src/dhscf.F'
--- Src/dhscf.F 2019-01-04 07:01:47 +0000
+++ Src/dhscf.F 2019-01-16 11:00:58 +0000
@@ -694,6 +694,10 @@
       use m_ts_voltage, only: ts_voltage
       use m_ts_hartree, only: ts_hartree_fix
 
+#ifdef SIESTA__ELSI
+      use m_elsi_interface, only: elsi_save_potential
+#endif
+
       implicit none
 
       integer
@@ -1653,6 +1657,12 @@
 #endif
       endif
 
+#ifdef SIESTA__ELSI
+      ! This routine, and the associated data, live
+      ! in the ELSI interface module
+      call elsi_save_potential(ntpl,nspin,Vscf,comm=MPI_Comm_World)
+#endif
+
 C ----------------------------------------------------------------------
 C Print vacuum level
 C ----------------------------------------------------------------------
 
=== added file 'Src/fold_auxcell.f90'
--- Src/fold_auxcell.f90 1970-01-01 00:00:00 +0000
+++ Src/fold_auxcell.f90 2019-01-16 11:00:58 +0000
@@ -0,0 +1,85 @@
+module m_fold_auxcell
+  public :: fold_sparse_arrays
+  private
+CONTAINS
+  subroutine fold_sparse_arrays(no_l,no_u,numh,listhptr,nnz,listh, &
+                                numh_u,listhptr_u,nnz_u,listh_u,ind2ind_u)
+
+! Fold-in a sparse matrix from the auxiliary supercell into the
+! unit cell (see picture)
+
+    ! Number of orbitals handled by this node
+    integer, intent(in) :: no_l
+    ! Number of orbitals in unit cell
+    integer, intent(in) :: no_u
+
+    ! Number of interactions per orbital; pointer to beginning of sparse array data
+    integer, intent(in) :: numh(no_l), listhptr(no_l)
+    ! Total number of interactions
+    integer, intent(in) :: nnz
+    ! Columns of the (rectangular) supercell-based matrix
+    integer, intent(in) :: listh(nnz)
+
+    ! Output: All interactions are collapsed to the unit cell orbitals
+
+    ! Number of interactions per orbital; pointer to beginning of sparse array data
+    integer, intent(out) :: numh_u(no_l), listhptr_u(no_l)
+    ! Total number of interactions
+    integer, intent(out) :: nnz_u
+    ! Columns of the (square) unit cell-based matrix
+    integer, allocatable, intent(out) :: listh_u(:)
+    ! Mapper of indexes from supercell-sparse-array to unit-cell sparse array
+    integer, allocatable, intent(out) :: ind2ind_u(:)
+
+
+    ! Local variables
+    integer :: iuo, ind, j, jo, j_u, ind_u, juo
+    integer, allocatable :: mask(:)
+
+    allocate(mask(no_u))
+
+    ! Find out how many "folded" interactions there are
+    nnz_u = 0
+    do iuo = 1,no_l
+       mask = 0
+       do j = 1,numh(iuo)
+          ind = listhptr(iuo) + j
+          jo = listh(ind)
+          juo = mod(jo-1,no_u) + 1
+          mask(juo) = 1   ! Found one
+       enddo
+       numh_u(iuo) = sum(mask)
+       nnz_u = nnz_u + numh_u(iuo)
+    enddo
+
+    allocate(listh_u(nnz_u))
+    allocate(ind2ind_u(nnz))
+
+    ! Generate folded pointer array
+    listhptr_u(1) = 0
+    do iuo = 2, no_l
+       listhptr_u(iuo) = listhptr_u(iuo-1) + numh_u(iuo-1)
+    enddo
+
+    ! Complete the mapping
+    do iuo = 1,no_l
+       mask(:) = 0
+       ind_u = listhptr_u(iuo)
+       ind = listhptr(iuo)
+       do j = 1,numh(iuo)
+          ind = ind + 1
+          jo = listh(ind)
+          juo = mod(jo-1,no_u) + 1
+          if (mask(juo) > 0) then
+             ! juo already seen and its place recorded
+             ind2ind_u(ind) = mask(juo)
+          else
+             ind_u = ind_u + 1
+             listh_u(ind_u) = juo
+             ind2ind_u(ind) = ind_u   ! map
+             mask(juo) = ind_u        ! record
+          endif
+       enddo
+    enddo
+  end subroutine fold_sparse_arrays
+end module m_fold_auxcell
=== modified file 'Src/fsiesta_mpi.F90'
--- Src/fsiesta_mpi.F90 2018-12-03 22:23:36 +0000
+++ Src/fsiesta_mpi.F90 2019-01-16 11:00:58 +0000
@@ -197,6 +197,7 @@
                                              ! MPI_COMM_WORLD
   use mpi_siesta, only: MPI_Comm_Siesta => MPI_Comm_World ! What siesta uses
                                              ! as MPI_Comm_World
+  use mpi_siesta, only: MPI_Comm_DFT         ! Siesta-instance (dft-calc)
   use mpi_siesta, only: MPI_Integer          ! Integer data type
   use mpi_siesta, only: MPI_Character        ! Character data type
   use mpi_siesta, only: MPI_Double_Precision ! Real double precision type
@@ -372,6 +373,9 @@
                // trim(label) // ' 2> /dev/null')
   endif
 
+  ! Set the whole-siesta-instance (dft calc) communicator
+  MPI_Comm_DFT = MPI_Comm_Siesta
+
 #endif
 
 ! Initialize flags and counters
 
=== modified file 'Src/initparallel.F'
--- Src/initparallel.F 2018-11-23 10:32:56 +0000
+++ Src/initparallel.F 2019-01-16 11:00:58 +0000
@@ -16,7 +16,7 @@
       use fdf
       use siesta_options, only: want_domain_decomposition
       use siesta_options, only: want_spatial_decomposition
-      use siesta_options, only: isolve, SOLVE_ORDERN, SOLVE_PEXSI
+      use siesta_options, only: isolve, SOLVE_ORDERN
       use siesta_options, only: rcoor
       use parallel, only : Node, Nodes, BlockSize, ProcessorY
       use parallelsubs, only : WhichNodeOrb, GetNodeOrbs
 
=== added file 'Src/m_elsi_interface.F90'
--- Src/m_elsi_interface.F90 1970-01-01 00:00:00 +0000
+++ Src/m_elsi_interface.F90 2019-01-16 11:00:58 +0000
@@ -0,0 +1,1601 @@
1!
2! Alberto Garcia, February-July 2018
3! Modified by Victor M. Garcia-Suarez. January 2019
4! (Introduction of an extra flag to only get the EDM instead of carrying
5! out a full computation --- to be refactored)
6!
7! ELSI DM-based interface to Siesta. It uses the sparse matrices from Siesta,
8! and obtains the DM (and optionally the EDM) matrices in sparse form.
9!
10! The elsi_getdm routine is in principle able to perform (spin-polarized)
11! calculations for real matrices (i.e., at the Gamma point), and periodic
12! calculations for complex matrices (i.e., multiple k-points), including spin.
13!
14! The structure of the solver routine is such that it will detect when it is
15! called for the first scf step, so it can perform any needed initialization.
16! The same idea can be used for the diagonalization mode, with (more extensive)
17! appropriate changes.
18!
19! The module also exports the "elsi_finalize_scfloop" routine, to be called from
20! the appropriate place. Some variables are kept at the module level for this.
21!
22! Usage: Compile Siesta with -DSIESTA__ELSI
23! Define
24! SolutionMethod ELSI
25! in the fdf file
26! TODO:
27! - MPI.Nprocs.SIESTA is not working -- maybe we will remove that feature
28! - Add more documentation
29! - Maybe refactor a few things
30!
31module m_elsi_interface
32
33#if SIESTA__ELSI
34
35 use precision, only: dp
36 use elsi
37
38 implicit none
39
40 private
41
42 integer, parameter :: ELPA_SOLVER = 1 ! solver
43 integer, parameter :: OMM_SOLVER = 2 ! solver
44 integer, parameter :: PEXSI_SOLVER = 3 ! solver
45 integer, parameter :: SIPS_SOLVER = 5 ! solver
46 integer, parameter :: NTPOLY_SOLVER = 6 ! solver
47 integer, parameter :: MULTI_PROC = 1 ! parallel_mode
48 integer, parameter :: SIESTA_CSC = 2 ! distribution
49 integer, parameter :: GAUSSIAN = 0 ! broadening
50 integer, parameter :: FERMI = 1 ! broadening
51 integer, parameter :: METHFESSEL_PAXTON = 2 ! broadening
52 integer, parameter :: CUBIC = 3 ! broadening
53 integer, parameter :: COLD = 4 ! broadening
54 integer, parameter :: ELSI_NOT_SET = -910910
55
56 type(elsi_handle) :: elsi_h
57
58 integer :: elsi_global_comm ! Used by all routines. Freed at end of scf loop
59 integer :: which_solver
60 integer :: which_broad
61 integer :: out_level
62 integer :: out_json
63 integer :: mp_order
64 integer :: elpa_flavor
65 integer :: omm_flavor
66 integer :: omm_n_elpa
67 integer :: pexsi_tasks_per_pole
68 integer :: pexsi_tasks_symbolic
69 integer :: sips_n_slice
70 integer :: sips_n_elpa
71 integer :: ntpoly_method
72
73 real(dp) :: omm_tol
74 real(dp) :: ntpoly_filter
75 real(dp) :: ntpoly_tol
76
77 character(len=6) :: solver_string
78 character(len=5) :: broad_string
79
80 real(dp), allocatable :: v_old(:,:) ! For mu update in PEXSI solver
81 real(dp), allocatable :: delta_v(:,:) ! For mu update in PEXSI solver
82
83 public :: elsi_getdm
84 public :: elsi_finalize_scfloop
85 public :: elsi_save_potential
86
87CONTAINS
88
89subroutine elsi_getdm(iscf, no_s, nspin, no_l, maxnh, no_u, &
90 numh, listhptr, listh, H, S, qtot, temp, &
91 xijo, nkpnt, kpoint, kweight, &
92 eo, qo, Dscf, Escf, ef, Entropy, occtol, neigwanted, Get_EDM_Only)
93
94 !
95 ! Analogous to 'diagon', it dispatches ELSI solver routines as needed
96 !
97
98 use m_fold_auxcell, only: fold_sparse_arrays ! Could be called in state_init
99
100! Missing for now
101! eo(no_u,nspin,nk) : Eigenvalues
102! qo(no_u,nspin,nk) : Occupations of eigenstates
103
104
105 real(dp), intent(inout) :: H(:,:), S(:) ! Note: we might overwrite these
106 integer, intent(in) :: iscf, maxnh, no_u, no_l, no_s, nkpnt
107 integer, intent(in) :: neigwanted, nspin
108 integer, intent(in) :: listh(maxnh), numh(no_l), listhptr(no_l)
109 real(dp), intent(in) :: kpoint(3,nkpnt), qtot, temp, kweight(nkpnt), occtol,xijo(3,maxnh)
110
111 real(dp), intent(out) :: Dscf(maxnh,nspin), ef, Escf(maxnh,nspin), Entropy
112 real(dp), intent(out) :: eo(no_u,nspin,nkpnt), qo(no_u,nspin,nkpnt)
113
114 ! Interim flag to just get the EDM
115 ! Note that, if .true., the DM is NOT obtained
116 ! This needs to be refactored
117
118 logical, intent(in) :: Get_EDM_Only
119
120
121 logical :: gamma, using_aux_cell
122 integer, allocatable :: numh_u(:), listhptr_u(:), listh_u(:)
123 integer, allocatable :: ind2ind_u(:)
124
125 !
126 real(dp), allocatable, dimension(:,:) :: Dscf_u, Escf_u, H_u
127 real(dp), allocatable, dimension(:) :: S_u
128
129 integer :: iuo, ispin, j, ind, ind_u, nnz_u
130
131 external die
132
133 gamma = ((nkpnt == 1) .and. (sum(abs(kpoint(:,1))) == 0.0_dp))
134 using_aux_cell = (no_s /= no_u)
135
136 if (gamma) then
137
138 if (.not. using_aux_cell) then
139
140 call elsi_real_solver(iscf, no_u, no_l, nspin, &
141 maxnh, listhptr, listh, qtot, temp, &
142 H, S, Dscf, Escf, ef, Entropy, Get_EDM_Only)
143
144 else
145
146 ! We fold-in the sparse arrays so that they add up all the
147 ! matrix elements whose 'column' supercell index maps to the same
148 ! unit-cell column. Note that they are still sparse.
149
150 ! First, determine the appropriate index arrays (ending in _u), and
151 ! a mapper ind2ind_u
152
153 allocate(numh_u(no_l), listhptr_u(no_l))
154 call fold_sparse_arrays(no_l,no_u, &
155 numh,listhptr,maxnh,listh, &
156 numh_u,listhptr_u,nnz_u,listh_u,ind2ind_u)
157
158 allocate(H_u(nnz_u,nspin), S_u(nnz_u))
159 S_u = 0
160 H_u = 0
161 do iuo = 1,no_l
162 do j = 1,numh(iuo)
163 ind = listhptr(iuo) + j
164 ind_u = ind2ind_u(ind)
165 S_u(ind_u) = S_u(ind_u) + S(ind)
166 do ispin = 1, nspin
167 H_u(ind_u,ispin) = H_u(ind_u,ispin) + H(ind,ispin)
168 enddo
169 enddo
170 enddo
171
172 ! We can now call the standard real solver routine
173 allocate(Dscf_u(nnz_u,nspin), Escf_u(nnz_u,nspin))
174
175 call elsi_real_solver(iscf, no_u, no_l, nspin, &
176 nnz_u, listhptr_u, listh_u, qtot, temp, &
177 H_u, S_u, Dscf_u, Escf_u, ef, Entropy, Get_EDM_Only)
178
179 deallocate(H_u, S_u)
180 deallocate(numh_u, listhptr_u, listh_u)
181
182
183 ! Unfold. We put the same '_u' DM entry into all image slots.
184 do iuo = 1,no_l
185 do j = 1,numh(iuo)
186 ind = listhptr(iuo) + j
187 ind_u = ind2ind_u(ind)
188 do ispin = 1, nspin
189 Dscf(ind,ispin) = Dscf_u(ind_u,ispin)
190 Escf(ind,ispin) = Escf_u(ind_u,ispin)
191 enddo
192 enddo
193 enddo
194 deallocate(Dscf_u, Escf_u)
195 deallocate(ind2ind_u)
196
197 endif ! using auxiliary supercell with Gamma sampling
198
199 else
200
201 ! We need more preparation
202 call elsi_kpoints_dispatcher(iscf, no_s, nspin, no_l, maxnh, no_u, &
203 numh, listhptr, listh, H, S, qtot, temp, &
204 xijo, nkpnt, kpoint, kweight, &
205 eo, qo, Dscf, Escf, ef, Entropy, occtol, neigwanted, Get_EDM_Only)
206
207 endif
208
209end subroutine elsi_getdm
210
211subroutine elsi_get_opts()
212
213 use fdf, only: fdf_get
214
215 solver_string = fdf_get("ELSISolver", "elpa")
216 out_level = fdf_get("ELSIOutputLevel", 0)
217 out_json = fdf_get("ELSIOutputJson", 1)
218 broad_string = fdf_get("ELSIBroadeningMethod", "fermi")
219 mp_order = fdf_get("ELSIBroadeningMPOrder", 1)
220 elpa_flavor = fdf_get("ELSIELPAFlavor", 2)
221 omm_flavor = fdf_get("ELSIOMMFlavor", 0)
222 omm_n_elpa = fdf_get("ELSIOMMELPASteps", 3)
223 omm_tol = fdf_get("ELSIOMMTolerance", 1.0e-9_dp)
224 pexsi_tasks_per_pole = fdf_get("ELSIPEXSITasksPerPole", ELSI_NOT_SET)
225 pexsi_tasks_symbolic = fdf_get("ELSIPEXSITasksSymbolic", 1)
226 sips_n_slice = fdf_get("ELSISIPSSlices", ELSI_NOT_SET)
227 sips_n_elpa = fdf_get("ELSISIPSELPASteps", 2)
228 ntpoly_method = fdf_get("ELSINTPOLYMethod", 2)
229 ntpoly_filter = fdf_get("ELSINTPOLYFilter", 1.0e-9_dp)
230 ntpoly_tol = fdf_get("ELSINTPOLYTolerance", 1.0e-6_dp)
231
232 select case (solver_string)
233 case ("elpa", "ELPA")
234 which_solver = ELPA_SOLVER
235 case ("omm", "OMM")
236 which_solver = OMM_SOLVER
237 case ("pexsi", "PEXSI")
238 which_solver = PEXSI_SOLVER
239 case ("sips", "SIPS", "SIPs")
240 which_solver = SIPS_SOLVER
241 case ("ntpoly", "NTPOLY", "NTPoly")
242 which_solver = NTPOLY_SOLVER
243 case default
244 which_solver = ELPA_SOLVER
245 end select
246
247 select case (broad_string)
248 case ("gauss")
249 which_broad = GAUSSIAN
250 case ("fermi")
251 which_broad = FERMI
252 case ("mp")
253 which_broad = METHFESSEL_PAXTON
254 case ("cubic")
255 which_broad = CUBIC
256 case ("cold")
257 which_broad = COLD
258 case default
259 which_broad = FERMI
260 end select
261
262end subroutine elsi_get_opts
263
264!======================================================================================
265! ELSI takes 1D block-cyclic distributed CSC/CSR matrices as its
266! input/output.
267!
268! But note that this version assumes *the same* distributions for Siesta (setup_H et al) and ELSI
269! operations.
270!
271subroutine elsi_real_solver(iscf, n_basis, n_basis_l, n_spin, nnz_l, row_ptr, &
272 col_idx, qtot, temp, ham, ovlp, dm, edm, ef, ets, Get_EDM_Only)
273
274 use m_mpi_utils, only: globalize_sum
275 use parallel, only: BlockSize
276#ifdef MPI
277 use mpi_siesta
278#endif
279 use class_Distribution
280 use m_redist_spmatrix, only: aux_matrix, redistribute_spmatrix
281 use alloc
282
283 implicit none
284
285 integer, intent(in) :: iscf ! SCF step counter
286 integer, intent(in) :: n_basis ! Global basis
287 integer, intent(in) :: n_basis_l ! Local basis
288 integer, intent(in) :: n_spin
289 integer, intent(in) :: nnz_l ! Local nonzero
290 integer, intent(in) :: row_ptr(n_basis_l)
291 integer, intent(in), target :: col_idx(nnz_l)
292 real(dp), intent(in) :: qtot
293 real(dp), intent(in) :: temp
294 real(dp), intent(inout), target :: ham(nnz_l,n_spin)
295 real(dp), intent(inout), target :: ovlp(nnz_l)
296 real(dp), intent(out) :: dm(nnz_l,n_spin)
297 real(dp), intent(out) :: edm(nnz_l,n_spin)
298 real(dp), intent(out) :: ef ! Fermi energy
299 real(dp), intent(out) :: ets ! Entropy/k, dimensionless
300
301 integer :: ierr
302 integer :: n_state
303 integer :: nnz_g
304
305 real(dp) :: energy
306
307 integer, allocatable, dimension(:) :: row_ptr2
308 integer, allocatable, dimension(:), target :: numh
309
310 integer :: elsi_Spatial_comm, elsi_Spin_comm
311
312 type(distribution) :: dist_global
313 type(distribution) :: dist_spin(2)
314
315 type(aux_matrix) :: pkg_global, pkg_spin ! Packages for transfer
316
317 integer :: my_spin
318
319 integer :: my_no_l
320 integer :: my_nnz_l
321 integer :: my_nnz
322 integer, pointer :: my_row_ptr2(:) => null()
323 integer :: i, ih, ispin, spin_rank
324 real(dp) :: ets_spin
325
326 integer, pointer :: my_col_idx(:)
327 real(dp), pointer :: my_S(:)
328 real(dp), pointer :: my_H(:)
329 real(dp), pointer :: my_DM(:) => null()
330 real(dp), pointer :: my_EDM(:) => null()
331
332 integer :: date_stamp
333
334 logical :: Get_EDM_Only
335
336 external :: timer
337
338#ifndef MPI
339 call die("This ELSI solver interface needs MPI")
340#endif
341
342 ! Global communicator is a duplicate of passed communicator
343 call MPI_Comm_Dup(MPI_Comm_DFT, elsi_global_comm, ierr)
344
345 call timer("elsi", 1)
346
347 ! Initialization
348 if (iscf == 1) then
349
350 ! Get ELSI options
351 call elsi_get_opts()
352
353 ! Number of states to solve when calling an eigensolver
354 n_state = min(n_basis, n_basis/2+5)
355
356 ! Now we have all ingredients to initialize ELSI
357 call elsi_init(elsi_h, which_solver, MULTI_PROC, SIESTA_CSC, n_basis, &
358 qtot, n_state)
359
360 ! Output
361 call elsi_set_output(elsi_h, out_level)
362 call elsi_set_output_log(elsi_h, out_json)
363 call elsi_set_write_unit(elsi_h, 6)
364
365 ! Broadening
366 call elsi_set_mu_broaden_scheme(elsi_h, which_broad)
367 call elsi_set_mu_broaden_width(elsi_h, temp)
368 call elsi_set_mu_mp_order(elsi_h, mp_order)
369
370 ! Solver settings
371 call elsi_set_elpa_solver(elsi_h, elpa_flavor)
372
373 call elsi_set_omm_flavor(elsi_h, omm_flavor)
374 call elsi_set_omm_n_elpa(elsi_h, omm_n_elpa)
375 call elsi_set_omm_tol(elsi_h, omm_tol)
376
377 if (pexsi_tasks_per_pole /= ELSI_NOT_SET) then
378 call elsi_set_pexsi_np_per_pole(elsi_h, pexsi_tasks_per_pole)
379 end if
380
381 call elsi_set_pexsi_np_symbo(elsi_h, pexsi_tasks_symbolic)
382 call elsi_set_pexsi_temp(elsi_h, temp)
383! call elsi_set_pexsi_gap(elsi_h, gap)
384! call elsi_set_pexsi_delta_e(elsi_h, delta_e)
385 call elsi_set_pexsi_mu_min(elsi_h, -10.0_dp)
386 call elsi_set_pexsi_mu_max(elsi_h, 10.0_dp)
387
388 if (sips_n_slice /= ELSI_NOT_SET) then
389 call elsi_set_sips_n_slice(elsi_h, sips_n_slice)
390 end if
391
392 call elsi_set_sips_n_elpa(elsi_h, sips_n_elpa)
393 call elsi_set_sips_interval(elsi_h, -10.0_dp, 10.0_dp)
394
395 call elsi_set_ntpoly_method(elsi_h, ntpoly_method)
396 call elsi_set_ntpoly_filter(elsi_h, ntpoly_filter)
397 call elsi_set_ntpoly_tol(elsi_h, ntpoly_tol)
398
399 endif ! iscf == 1
400
401 if (n_spin == 1) then
402
403 ! Sparsity pattern
404 call globalize_sum(nnz_l, nnz_g, comm=elsi_global_comm)
405
406 allocate(row_ptr2(n_basis_l+1))
407 row_ptr2(1:n_basis_l) = row_ptr(1:n_basis_l)+1
408 row_ptr2(n_basis_l+1) = nnz_l+1
409
410 call elsi_set_csc(elsi_h, nnz_g, nnz_l, n_basis_l, col_idx, row_ptr2)
411 deallocate(row_ptr2)
412
413 call elsi_set_csc_blk(elsi_h, BlockSize)
414 call elsi_set_mpi(elsi_h, elsi_global_comm)
415
416 else
417
418 ! MPI logic for spin polarization
419
420 ! Re-create numh, as we use it in the transfer
421 allocate(numh(n_basis_l))
422 numh(1) = row_ptr(2)
423 do i = 2, n_basis_l-1
424 numh(i) = row_ptr(i+1)-row_ptr(i)
425 enddo
426 numh(n_basis_l) = nnz_l - row_ptr(n_basis_l)
427
428 ! Split the communicator in spins and get distribution objects
429 ! for the data redistribution needed
430 ! Note that dist_spin is an array
431 call get_spin_comms_and_dists(elsi_global_comm,elsi_global_comm, &
432 blocksize, n_spin, &
433 dist_global,dist_spin, elsi_spatial_comm, elsi_spin_comm)
434
435 ! Find out which spin team we are in, and tag the spin we work on
436 call mpi_comm_rank( elsi_Spin_Comm, spin_rank, ierr )
437 my_spin = spin_rank+1 ! {1,2}
438
439
440 ! This is done serially, each time filling one spin set
441 ! Note that **all processes** need to have the same pkg_global
442
443 do ispin = 1, n_spin
444
445 ! Load pkg_global data package
446 pkg_global%norbs = n_basis
447 pkg_global%no_l = n_basis_l
448 pkg_global%nnzl = nnz_l
449 pkg_global%numcols => numh
450 pkg_global%cols => col_idx
451
452 allocate(pkg_global%vals(2))
453 ! Link the vals items to the appropriate arrays (no extra memory here)
454 pkg_global%vals(1)%data => ovlp(:)
455 ! Note that we *cannot* say => ham(:,my_spin)
456 ! and avoid the sequential loop, as then half the processors will send
457 ! the information for 'spin up' and the other half the information for 'spin down',
458 ! which is *not* what we want.
459 pkg_global%vals(2)%data => ham(:,ispin)
460
461 call timer("redist_orbs_fwd", 1)
462
463 ! We are doing the transfers sequentially. One spin team is
464 ! 'idle' (in the receiving side) in each pass, as the dist_spin(ispin) distribution
465 ! does not involve them.
466
467 call redistribute_spmatrix(n_basis,pkg_global,dist_global, &
468 pkg_spin,dist_spin(ispin),elsi_global_Comm)
469
470 call timer("redist_orbs_fwd", 2)
471
472 if (my_spin == ispin) then ! Each team gets their own data
473
474 !nrows = pkg_spin%norbs ! or simply 'norbs'
475 my_no_l = pkg_spin%no_l
476 my_nnz_l = pkg_spin%nnzl
477 call MPI_AllReduce(my_nnz_l,my_nnz,1,MPI_integer,MPI_sum,elsi_Spatial_Comm,ierr)
478 ! generate off-by-one row pointer
479 call re_alloc(my_row_ptr2,1,my_no_l+1,"my_row_ptr2","elsi_solver")
480 my_row_ptr2(1) = 1
481 do ih = 1,my_no_l
482 my_row_ptr2(ih+1) = my_row_ptr2(ih) + pkg_spin%numcols(ih)
483 enddo
484
485 my_col_idx => pkg_spin%cols
486 my_S => pkg_spin%vals(1)%data
487 my_H => pkg_spin%vals(2)%data
488
489 call re_alloc(my_DM,1,my_nnz_l,"my_DM","elsi_solver")
490 call re_alloc(my_EDM,1,my_nnz_l,"my_EDM","elsi_solver")
491 endif
492
493 ! Clean pkg_global
494 nullify(pkg_global%vals(1)%data)
495 nullify(pkg_global%vals(2)%data)
496 deallocate(pkg_global%vals)
497 nullify(pkg_global%numcols)
498 nullify(pkg_global%cols)
499
500 enddo
501
502 call elsi_set_csc(elsi_h, my_nnz, my_nnz_l, my_no_l, my_col_idx, my_row_ptr2)
503 call de_alloc(my_row_ptr2,"my_row_ptr2","elsi_solver")
504
505 call elsi_set_csc_blk(elsi_h, BlockSize)
506 call elsi_set_spin(elsi_h, n_spin, my_spin)
507 call elsi_set_mpi(elsi_h, elsi_Spatial_comm)
508 call elsi_set_mpi_global(elsi_h, elsi_global_comm)
509
510 endif ! n_spin
511
512 call timer("elsi-solver", 1)
513
514 if (n_spin == 1) then
515 if (.not.Get_EDM_Only) then
516 call elsi_dm_real_sparse(elsi_h, ham, ovlp, DM, energy)
517 call elsi_get_entropy(elsi_h, ets)
518 else
519 call elsi_get_edm_real_sparse(elsi_h, EDM)
520 endif
521 else
522 ! Solve DM, and get (at every step for now) EDM, Fermi energy, and entropy
523 ! Energy is already summed over spins
524 if (.not.Get_EDM_Only) then
525 call elsi_dm_real_sparse(elsi_h, my_H, my_S, my_DM, energy)
526 call elsi_get_entropy(elsi_h, ets_spin)
527 else
528 call elsi_get_edm_real_sparse(elsi_h, my_EDM)
529 endif
530#ifdef SIESTA__ELSI__OLD_SPIN_CONVENTION
531 ! NOTE**
532 ! For ELSI versions before 2018-08-17 we still need to sum the entropy over spins
533 ! ... but to figure out the version is not trivial, as the relevant routines changed...
534 call globalize_sum(ets_spin, ets, comm=elsi_Spin_comm)
535#else
536 ! If this gives an error, then your version of ELSI is old, and you should
537 ! be using the pre-processor symbol above
538 ! (This works because 'elsi_get_datestamp' was added just
539 ! around the same time as the spin-reduction-convention change) (Aug 17 vs Aug 21, 2018...)
540 ! If you are unlucky enough to have an ELSI version in between, get rid of it.
541 call elsi_get_datestamp(date_stamp)
542#endif
543 endif
544
545 call elsi_get_mu(elsi_h, ef)
546 ets = ets/temp
547
548 ! Ef, energy, and ets are known to all nodes
549
550 call timer("elsi-solver", 2)
551
552 if ( n_spin == 2) then
553 ! Now we need to redistribute back
554
555 do ispin = 1, n_spin
556
557 if (my_spin == ispin) then
558 ! Prepare pkg_spin to transfer the right spin information
559 ! The other fields (numcols, cols) are the same and are still there
560 ! Deallocate my_S and my_H
561 call de_alloc(pkg_spin%vals(1)%data,"pkg_spin%vals(1)%data","elsi_solver")
562 call de_alloc(pkg_spin%vals(2)%data,"pkg_spin%vals(2)%data","elsi_solver")
563
564 pkg_spin%vals(1)%data => my_DM(1:my_nnz_l)
565 pkg_spin%vals(2)%data => my_EDM(1:my_nnz_l)
566
567 endif
568
569 ! pkg_global is clean now
570 call timer("redist_orbs_bck", 1)
571 call redistribute_spmatrix(n_basis,pkg_spin,dist_spin(ispin) &
572 ,pkg_global,dist_global,elsi_global_Comm)
573 call timer("redist_orbs_bck", 2)
574
575 ! Clean pkg_spin
576 if (my_spin == ispin) then
577 ! Each team deallocates during "its" spin cycle
578 call de_alloc(my_DM, "my_DM", "elsi_solver")
579 call de_alloc(my_EDM,"my_EDM","elsi_solver")
580
581 nullify(pkg_spin%vals(1)%data) ! formerly pointing to DM
582 nullify(pkg_spin%vals(2)%data) ! formerly pointing to EDM
583 deallocate(pkg_spin%vals)
584 ! allocated in the direct transfer
585 call de_alloc(pkg_spin%numcols,"pkg_spin%numcols","elsi_solver")
586 call de_alloc(pkg_spin%cols, "pkg_spin%cols", "elsi_solver")
587 endif
588
589
590 ! In future, pkg_global%vals(1,2) could be pointing to DM and EDM,
591 ! and the 'redistribute' routine check whether the vals arrays are
592 ! associated, to use them instead of allocating them.
593 DM(:,ispin) = pkg_global%vals(1)%data(:)
594 EDM(:,ispin) = pkg_global%vals(2)%data(:)
595 ! Check no_l
596 if (n_basis_l /= pkg_global%no_l) then
597 call die("Mismatch in no_l")
598 endif
599 ! Check listH
600 if (any(col_idx(:) /= pkg_global%cols(:))) then
601 call die("Mismatch in listH")
602 endif
603
604 ! Clean pkg_global
605 ! allocated by the transfer routine, but we did not actually
606 ! look at them
607 call de_alloc(pkg_global%numcols,"pkg_global%numcols","elsi_solver")
608 call de_alloc(pkg_global%cols, "pkg_global%cols", "elsi_solver")
609
610 call de_alloc(pkg_global%vals(1)%data,"pkg_global%vals(1)%data","elsi_solver")
611 call de_alloc(pkg_global%vals(2)%data,"pkg_global%vals(2)%data","elsi_solver")
612 deallocate(pkg_global%vals)
613
614 enddo
615
616 call MPI_Comm_Free(elsi_Spatial_comm, ierr)
617 call MPI_Comm_Free(elsi_Spin_comm, ierr)
618
619 endif
620
621 call timer("elsi", 2)
622
623
624end subroutine elsi_real_solver
625
626subroutine elsi_kpoints_dispatcher(iscf, no_s, nspin, no_l, maxnh, no_u, &
627 numh, listhptr, listh, H, S, qtot, temp, &
628 xijo, nkpnt, kpoint, kweight, &
629 eo, qo, Dscf, Escf, ef, Entropy, occtol, neigwanted, Get_EDM_Only)
630
631 use mpi_siesta, only: mpi_comm_dft
632 use mpi
633 use parallel, only: blocksize
634 use class_Distribution
635 use m_redist_spmatrix, only: aux_matrix, redistribute_spmatrix
636 use alloc
637
638 !
639 ! K-point redistribution, Hk and Sk building, and call to complex ELSI solver
640 !
641
642 use m_fold_auxcell, only: fold_sparse_arrays ! Could be called in state_init
643
644! Missing for now
645! eo(no_u,nspin,nk) : Eigenvalues
646! qo(no_u,nspin,nk) : Occupations of eigenstates
647
648
649 real(dp), intent(in), target :: H(:,:), S(:) ! Note that now we do not change them
650 integer, intent(in) :: iscf, maxnh, no_u, no_l, no_s, nkpnt
651 integer, intent(in) :: neigwanted, nspin
652 integer, intent(in) :: listhptr(no_l)
653 integer, intent(in), target :: listh(maxnh), numh(no_l)
654
655 real(dp), intent(out) :: Dscf(maxnh,nspin), ef, Escf(maxnh,nspin), Entropy
656 real(dp), intent(out) :: eo(no_u,nspin,nkpnt), qo(no_u,nspin,nkpnt)
657 real(dp), intent(in) :: kpoint(3,nkpnt), qtot, temp, kweight(nkpnt), occtol,xijo(3,maxnh)
658
659
660 integer :: mpirank, kcolrank, npGlobal
661 integer :: npPerK, color, my_kpt_n
662
663 integer :: kpt_comm, kpt_col_comm
664 integer :: Global_Group, kpt_Group
665
666 integer, allocatable :: ranks_in_world(:), ranks_in_world_AllK(:,:)
667 type(distribution) :: dist_global
668 type(distribution), allocatable :: dist_k(:)
669
670 type(aux_matrix) :: pkg_global, pkg_k
671 integer :: nvals
672 real(dp), allocatable, target :: xijo_transp(:,:)
673 real(dp), allocatable :: my_xijo_transp(:,:)
674 real(dp), allocatable :: my_xij(:,:)
675
676 integer :: my_no_l, my_nnz_l
677 integer, pointer :: my_numh(:)
678 integer, pointer :: my_listh(:)
679 integer, pointer :: my_listhptr(:) => null()
680 real(dp), allocatable :: my_S(:)
681 real(dp), allocatable :: my_H(:,:)
682 real(dp), pointer :: buffer(:) ! for unpacking help
683
684 real(dp), allocatable :: my_Escf(:,:)
685 real(dp), allocatable :: my_Dscf(:,:)
686 real(dp), allocatable, target :: my_Escf_reduced(:,:)
687 real(dp), allocatable, target :: my_Dscf_reduced(:,:)
688
689 complex(dp), allocatable :: DM_k(:,:)
690 complex(dp), allocatable :: EDM_k(:,:)
691 complex(dp), allocatable :: Hk(:,:)
692 complex(dp), allocatable :: Sk(:)
693
694 integer :: iuo, j, ind, ind_u, ispin
695 real(dp) :: kxij, my_kpoint(3)
696 complex(dp) :: kphs
697
698 integer :: i, ih, ik, ierr
699
700 integer, allocatable :: numh_u(:), listhptr_u(:), listh_u(:), ind2ind_u(:)
701 integer :: nnz_u
702
703 logical :: Get_EDM_Only
704
705 external die
706
707 ! Split the global communicator
708 ! Re-distribute H and S to the k-point (and spin) teams
709 ! Generate Hk, Sk
710 ! Call elsi_complex_solver
711 ! Construct and re-distribute global DM and EDM
712
713 ! Global communicator is a duplicate of passed communicator
714 call MPI_Comm_Dup(MPI_Comm_DFT, elsi_global_comm, ierr)
715 call MPI_Comm_Rank(elsi_global_comm, mpirank, ierr)
716 call MPI_Comm_Size(elsi_global_comm, npGlobal, ierr)
717
718 ! Split elsi_global_comm
719
720 npPerK = npGlobal/nkpnt
721
722 ! Options for split: As many groups as nkpnt, so numbering is trivial
723 color = mpirank/npPerK ! : color 0: 0,1,2,3 ; color 1: 4,5,6,7
724 call MPI_Comm_Split(elsi_global_comm, color, mpirank, kpt_Comm, ierr)
725 my_kpt_n = 1 + color ! 1 + mpirank/npPerK
726 ! Column communicator
727 color = mod(mpirank, npPerK) ! : color 0: 0,4 1: 1,5 2: 2,6 3: 3,7
728 call MPI_Comm_Split(elsi_global_comm, color, mpirank, kpt_col_Comm, ierr) ! OK to use mpirank: just ordering
729 call MPI_Comm_Rank(kpt_col_comm, kcolrank, ierr)
730
731 !print *, mpirank, "| ", "k-point ", my_kpt_n, " rank in col:", kcolrank
732
733 ! Create distribution for k-point group
734 call MPI_Comm_Group(elsi_global_comm, Global_Group, Ierr)
735 call MPI_Comm_Group(kpt_Comm, kpt_Group, Ierr)
736 allocate(ranks_in_world(npPerK))
737 call MPI_Group_translate_ranks( kpt_Group, npPerK, &
738 (/ (i,i=0,npPerK-1) /), &
739 Global_Group, ranks_in_world, ierr )
740
741 call MPI_Group_Free(kpt_Group, ierr)
742
743 allocate (ranks_in_World_AllK(npPerK,nkpnt))
744 ! We need everybody to have this information, as all nodes
745 ! are part of the global distribution side of the communication
746 ! (global_comm is the common-address space)
747
748 ! Use the k-column communicator
749 call MPI_AllGather(ranks_in_world,npPerK,MPI_integer,&
750 Ranks_in_World_AllK(1,1),npPerK, &
751 MPI_integer,kpt_col_Comm,ierr)
752
753 allocate(dist_k(nkpnt))
754 ! Create distributions known to all nodes (again, those in base_comm)
755 do ik = 1, nkpnt
756 call newDistribution(dist_k(ik), elsi_global_Comm, &
757 Ranks_in_World_AllK(:,ik), &
758 TYPE_BLOCK_CYCLIC, blockSize, "kpt dist")
759 enddo
760 deallocate(ranks_in_world,Ranks_in_World_AllK)
761 call MPI_Barrier(elsi_global_Comm,ierr)
762
763 call newDistribution(dist_global,elsi_global_Comm, (/ (i, i=0, npGlobal-1) /), &
764 TYPE_BLOCK_CYCLIC,BlockSize,"global dist")
765 call MPI_Barrier(elsi_global_Comm,ierr)
766
767 ! Redistribute arrays
768
769 ! No need to go sequentially over k-points
770 ! Load pkg_global data package
771 pkg_global%norbs = no_u
772 pkg_global%no_l = no_l
773 pkg_global%nnzl = maxnh
774 pkg_global%numcols => numh
775 pkg_global%cols => listh
776
777 nvals = 1 + 3 + nspin ! S, xijo (transposed), H(:,nspin)
778 allocate(pkg_global%vals(nvals))
779 ! Link the vals items to the appropriate arrays (no extra memory here)
780 pkg_global%vals(1)%data => S(:)
781 call transpose(xijo,xijo_transp)
782 do i = 1, 3
783 pkg_global%vals(1+i)%data => xijo_transp(:,i)
784 enddo
785 do ispin = 1, nspin
786 pkg_global%vals(4+ispin)%data => H(:,ispin)
787 enddo
788
789 ! Do sequential
790 do ik=1, nkpnt
791
792 call timer("redist_orbs_fwd", 1)
793 call redistribute_spmatrix(no_u,pkg_global,dist_global, &
794 pkg_k,dist_k(ik),elsi_global_Comm)
795 call timer("redist_orbs_fwd", 2)
796
797 if (my_kpt_n == ik) then
798 !------------------------------------------
799 ! Unpack info: real S and H (and index arrays) distributed over each kpt_comm
800 my_no_l = pkg_k%no_l
801 my_nnz_l = pkg_k%nnzl
802 my_numh => pkg_k%numcols
803
804 my_listh => pkg_k%cols
805
806 allocate(my_S(my_nnz_l))
807 my_S(:) = pkg_k%vals(1)%data(:)
808 call de_alloc(pkg_k%vals(1)%data)
809
810 allocate(my_xijo_transp(my_nnz_l,3))
811 do i = 1, 3
812 buffer => pkg_k%vals(1+i)%data
813 my_xijo_transp(:,i) = buffer(:)
814 call de_alloc(pkg_k%vals(1+i)%data)
815 enddo
816
817 allocate(my_H(my_nnz_l,nspin))
818 do ispin = 1, nspin
819 buffer => pkg_k%vals(4+ispin)%data
820 my_H(:,ispin) = buffer(:)
821 call de_alloc(pkg_k%vals(4+ispin)%data)
822 enddo
823 deallocate(pkg_k%vals)
824
825 ! Now we could clear the rest of pkg_k:
826 ! nullify(pkg_k%numcols)
827 ! nullify(pkg_k%cols)
828 !
829 ! but then we would need to remember to deallocate the actual arrays
830 ! after use; it is probably safer to keep the pkg references
831 !---------------------------
832
833 endif
834 enddo !ik
835
836 ! Clean pkg_global -- This is safe, as we use pointers to data only
837 do i = 1, size(pkg_global%vals)
838 nullify(pkg_global%vals(i)%data)
839 enddo
840 deallocate(pkg_global%vals)
841 nullify(pkg_global%numcols)
842 nullify(pkg_global%cols)
843
844 deallocate(xijo_transp) ! Auxiliary array used for sending
845
846 ! generate listhptr for folding/unfolding operations
847 call re_alloc(my_listhptr,1,my_no_l,"my_listhptr","elsi_solver")
848 my_listhptr(1) = 0
849 do ih = 2,my_no_l
850 my_listhptr(ih) = my_listhptr(ih-1) + my_numh(ih-1)
851 enddo
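 ! For illustration (hypothetical counts): my_numh = (/ 3, 2, 4 /)
 ! yields my_listhptr = (/ 0, 3, 5 /), i.e. the 0-based offsets of each
 ! row's first nonzero in the value arrays.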
852
853 ! The folded variables (*_u) are local to each team
854
855 allocate(numh_u(my_no_l), listhptr_u(my_no_l))
856 call fold_sparse_arrays(my_no_l,no_u, &
857 my_numh,my_listhptr,my_nnz_l,my_listh, &
858 numh_u,listhptr_u,nnz_u,listh_u,ind2ind_u)
859
860 !
861 call transpose(my_xijo_transp,my_xij)
862 deallocate(my_xijo_transp)
863
864 my_kpoint(:) = kpoint(:,my_kpt_n)
865
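 ! The loop below folds the supercell-resolved sparse matrices onto the
 ! unit-cell, k-dependent matrices by accumulating Bloch phases;
 ! schematically, with x_ij the intercell vector stored in my_xij:
 !   S_k(ind_u) = sum_ind S(ind) * exp(-i k.x_ij)
 !   H_k(ind_u) = sum_ind H(ind) * exp(-i k.x_ij)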
866 allocate(Hk(nnz_u,nspin), Sk(nnz_u))
867 Sk = 0
868 Hk = 0
869 do iuo = 1,my_no_l
870 do j = 1,my_numh(iuo)
871 ind = my_listhptr(iuo) + j
872 ind_u = ind2ind_u(ind)
873
874 kxij = my_kpoint(1) * my_xij(1,ind) + &
875 my_kpoint(2) * my_xij(2,ind) + &
876 my_kpoint(3) * my_xij(3,ind)
877 kphs = cdexp(dcmplx(0.0_dp, -1.0_dp)*kxij)
878
879 Sk(ind_u) = Sk(ind_u) + my_S(ind) *kphs
880 do ispin = 1, nspin
881 Hk(ind_u,ispin) = Hk(ind_u,ispin) + my_H(ind,ispin) *kphs
882 enddo
883 enddo
884 enddo
885
886 deallocate(my_S,my_H)
887 !
888
889 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done folding"
890 ! Prepare arrays for holding results
891 allocate(DM_k(nnz_u,nspin))
892 allocate(EDM_k(nnz_u,nspin))
893
894 call elsi_complex_solver(iscf, no_u, my_no_l, nspin, nnz_u, numh_u, listhptr_u, &
895 listh_u, qtot, temp, Hk, Sk, DM_k, EDM_k, Ef, Entropy, &
896 nkpnt, my_kpt_n, kpoint(:,my_kpt_n), kweight(my_kpt_n), &
897 kpt_Comm, Get_EDM_Only )
898
899 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done elsi_complex_solver"
900 deallocate(listhptr_u, numh_u, listh_u)
901 deallocate(Hk,Sk)
902
903 ! Re-create DM and EDM:
904 ! Unfold within a given k
905 ! Add up all the k contributions with the appropriate phases
906
907 ! Prepare arrays for holding results: Note sizes: these are folded out
908 allocate(my_Dscf(my_nnz_l,nspin))
909 allocate(my_Escf(my_nnz_l,nspin))
910
911 do iuo = 1, my_no_l
912 do j = 1, my_numh(iuo)
913 ind = my_listhptr(iuo) + j
914 ind_u = ind2ind_u(ind)
915
916 kxij = my_kpoint(1) * my_xij(1,ind) + &
917 my_kpoint(2) * my_xij(2,ind) + &
918 my_kpoint(3) * my_xij(3,ind)
919 kphs = cdexp(dcmplx(0.0_dp, +1.0_dp)*kxij)
920
921 do ispin = 1, nspin
922 my_Dscf(ind,ispin) = real( DM_k(ind_u,ispin) * kphs, kind=dp )
923 my_Escf(ind,ispin) = real( EDM_k(ind_u,ispin) * kphs, kind=dp )
924 enddo
925 enddo
926 enddo
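 ! This is the inverse of the folding above: each supercell entry takes
 ! Re[ DM_k(ind_u) * exp(+i k.x_ij) ]. The sum over k-points (after the
 ! weights are applied just below) is completed by the MPI_Reduce calls
 ! over kpt_col_comm further down.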
927
928 ! Apply the k-point weight to the (apparently normalized) DM and EDM that
929 ! come out of ELSI
930
931 my_Dscf = kweight(my_kpt_n) * my_Dscf
932 my_Escf = kweight(my_kpt_n) * my_Escf
933
934 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done generating my_Dscf"
935 deallocate(my_xij)
936 deallocate(ind2ind_u)
937 deallocate(DM_k,EDM_k)
938
939 if (my_kpt_n == 1 ) then
940 ! Prepare arrays for holding reduced data
941 allocate(my_Dscf_reduced(my_nnz_l,nspin))
942 allocate(my_Escf_reduced(my_nnz_l,nspin))
943 if (kcolrank /= 0) call die("Rank 0 in kpt_col_comm not doing kpt 1")
944 else
945 ! These should not be referenced
946 allocate(my_Dscf_reduced(1,1))
947 allocate(my_Escf_reduced(1,1))
948 endif
949
950 ! Use k-point column communicator, and reduce to rank 0,
951 ! which *should* correspond to kpt=1... (checked above)
952 call MPI_Reduce( my_Dscf, my_Dscf_reduced, nspin*my_nnz_l, MPI_Double_Precision, &
953 MPI_Sum, 0, kpt_col_comm, ierr )
954 call MPI_Reduce( my_Escf, my_Escf_reduced, nspin*my_nnz_l, MPI_Double_Precision, &
955 MPI_Sum, 0, kpt_col_comm, ierr )
956
957 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done reducing my_Dscf"
958 deallocate(my_Dscf, my_Escf)
959
960 ! redistribute to global distribution, only from the first k-point
961
962 if (my_kpt_n == 1) then
963
964 ! These bits are still there *** update if we ever clean pkgs after use
965 ! pkg_k%norbs = no_u
966 ! pkg_k%no_l = my_no_l
967 ! pkg_k%nnzl = my_nnz_l
968 ! pkg_k%numcols => my_numh
969 ! pkg_k%cols => my_listh
970
971 nvals = 2*nspin ! DM, EDM
972 allocate(pkg_k%vals(nvals))
973 do ispin = 1, nspin
974 pkg_k%vals(ispin)%data => my_Dscf_reduced(:,ispin)
975 pkg_k%vals(nspin+ispin)%data => my_Escf_reduced(:,ispin)
976 enddo
977
978 endif
979
980 ! Everybody participates on the receiving side of the transfer, but
981 ! only the kpt=1 team on the sending side, since we are using dist_k(1)
982 call timer("redist_dm-edm_bck", 1)
983 call redistribute_spmatrix(no_u,pkg_k,dist_k(1), &
984 pkg_global,dist_global,elsi_global_Comm)
985 call timer("redist_dm-edm_bck", 2)
986
987 ! Deallocate aux arrays
988 deallocate(my_Dscf_reduced, my_Escf_reduced)
989 if (my_kpt_n == 1) then
990 !Clean pkg_k in sender
991 deallocate(pkg_k%vals) ! just pointers
992 endif
993
994 !Clean all pkg_k: These were actually allocated in the forward transfer
995 call de_alloc(pkg_k%numcols,"pkg_k%numcols")
996 call de_alloc(pkg_k%cols,"pkg_k%cols")
997
998 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done cleaning pkg_k"
999
1000 ! Unpack data (all processors, original global distribution)
1001 ! In future, pkg_global%vals(:) could point to DM and EDM, and the
1002 ! 'redistribute' routine could check whether the vals arrays are
1003 ! associated and use them instead of allocating them.
1004 do ispin = 1, nspin
1005 Dscf(:,ispin) = pkg_global%vals(ispin)%data(:)
1006 Escf(:,ispin) = pkg_global%vals(nspin+ispin)%data(:)
1007 enddo
1008 ! Check no_l
1009 if (no_l /= pkg_global%no_l) then
1010 call die("Mismatch in no_l at end of kpoints-dispatcher")
1011 endif
1012 ! Check listH
1013 if (any(listh(:) /= pkg_global%cols(:))) then
1014 call die("Mismatch in listH at end of kpoints-dispatcher")
1015 endif
1016 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done Dscf"
1017
1018 ! Clean pkg_global
1019 call de_alloc(pkg_global%numcols,"pkg_global%numcols","elsi_solver")
1020 call de_alloc(pkg_global%cols, "pkg_global%cols", "elsi_solver")
1021 do i = 1, size(pkg_global%vals)
1022 call de_alloc(pkg_global%vals(i)%data,"pkg_global%vals%data","kpoints_dispatcher")
1023 enddo
1024 deallocate(pkg_global%vals)
1025 !print *, mpirank, "| ", "k-point ", my_kpt_n, " Done cleaning pkg_global"
1026
1027 ! Reduction of entropy over kpt_col_comm is not necessary
1028
1029 call MPI_Comm_Free(kpt_comm, ierr)
1030 call MPI_Comm_Free(kpt_col_comm, ierr)
1031
1032end subroutine elsi_kpoints_dispatcher
1033
1034! Clean up: Finalize ELSI instance and free MPI communicator.
1035!
1036subroutine elsi_finalize_scfloop()
1037
1038 integer :: ierr
1039
1040 ! Make which_solver invalid
1041 which_solver = ELSI_NOT_SET
1042
1043 if (allocated(v_old)) then
1044 deallocate(v_old)
1045 deallocate(delta_v)
1046 end if
1047
1048 call elsi_finalize(elsi_h)
1049
1050 call MPI_Comm_Free(elsi_global_comm, ierr)
1051
1052end subroutine elsi_finalize_scfloop
1053
1054! Save the Hartree + XC potential and find the minimum and maximum change
1055! in it between two SCF iterations.
1056!
1057subroutine elsi_save_potential(n_pts, n_spin, v_scf, comm)
1058
1059 use m_mpi_utils, only: globalize_min, globalize_max
1060 use parallel, only: ionode, node
1061 use units, only: eV
1062
1063 integer, intent(in) :: n_pts
1064 integer, intent(in) :: n_spin
1065 real(dp), intent(in) :: v_scf(n_pts,n_spin)
1066 integer , intent(in) :: comm ! The Siesta communicator used for grid operations
1067 ! since this routine is called from dhscf...
1068
1069 real(dp) :: mu_min
1070 real(dp) :: mu_max
1071 real(dp) :: dv_min
1072 real(dp) :: dv_max
1073 real(dp) :: tmp
1074
1075 if (which_solver == PEXSI_SOLVER) then
1076 if (.not. allocated(v_old)) then
1077 allocate(v_old(n_pts,n_spin))
1078 allocate(delta_v(n_pts,n_spin))
1079
1080 v_old = v_scf
1081
1082 mu_min = -10.0_dp
1083 mu_max = 10.0_dp
1084 else
1085 call elsi_get_pexsi_mu_min(elsi_h, mu_min)
1086 call elsi_get_pexsi_mu_max(elsi_h, mu_max)
1087 if (ionode) then
1088 print *, "*-- solver mu_min, mu_max:", mu_min/eV, mu_max/eV
1089 endif
1090
1091 delta_v = v_scf - v_old
1092 v_old = v_scf
1093
1094 ! Get minimum and maximum of change of total potential
1095 tmp = minval(delta_v)
1096
1097 call globalize_min(tmp, dv_min, comm=comm)
1098
1099 tmp = maxval(delta_v)
1100
1101 call globalize_max(tmp, dv_max, comm=comm)
1102 if (ionode) then
1103 print *, " *---- min and max Delta-V: ", dv_min/eV, dv_max/eV
1104 endif
1105
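 ! Sketch of the update applied below: the chemical-potential bracket
 ! is shifted rigidly with the extremes of the potential change, i.e.
 ! the next search window is [ mu_min + min(dV), mu_max + max(dV) ].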
1106 mu_min = mu_min+dv_min
1107 mu_max = mu_max+dv_max
1108 end if
1109
1110 ! Adjust chemical potential range for PEXSI
1111 call elsi_set_pexsi_mu_min(elsi_h, mu_min)
1112 call elsi_set_pexsi_mu_max(elsi_h, mu_max)
1113 if (ionode) then
1114 print *, "*-- updated mu_min, mu_max:", mu_min/eV, mu_max/eV
1115 endif
1116 end if
1117
1118end subroutine elsi_save_potential
1119
1120! To-Do: Check the implications of distinguishing between 'global_comm'
1121! (common space for communications) and 'base_comm' (to be split)
1122! For the real case they can be the same
1123!
1124! This routine should be called by everybody in global_comm
1125
1126subroutine get_spin_comms_and_dists(global_comm, base_comm, blocksize, n_spin,&
1127 dist_global, dist_spin, spatial_comm, spin_comm)
1128
1129 use mpi_siesta
1130 use class_Distribution ! distribution, newDistribution, types
1131
1132 integer, intent(in) :: global_comm ! Communicator for global distribution
1133 ! Needed to include the actual global ranks
1134 ! in the distribution objects
1135
1136 integer, intent(in) :: base_comm ! Communicator to be split
1137 integer, intent(out) :: spatial_comm ! "Orbital" comm (one for each spin team)
1138 integer, intent(out) :: spin_comm ! "Inner" spin comm (for reductions)
1139
1140 integer, intent(in) :: blocksize
1141 integer, intent(in) :: n_spin ! should be 2 in this version
1142
1143 ! Note inout for bud objects
1144 type(distribution), intent(inout) :: dist_global ! global distribution object
1145 type(distribution), intent(inout) :: dist_spin(2) ! per-spin objects
1146
1147 ! Local variables
1148 integer :: color, base_rank
1149 integer :: npGlobal, npPerSpin
1150 integer, allocatable, dimension(:) :: global_ranks_in_world
1151 integer, allocatable, dimension(:) :: ranks_in_world ! should be 'spatial_ranks...'
1152 integer, allocatable, dimension(:,:) :: ranks_in_World_Spin
1153 integer :: global_group, Spatial_group
1154 integer :: i, ispin, ierr
1155
1156 call MPI_Comm_Size(global_comm, npGlobal, ierr)
1157 ! This distribution could be created by the caller...
1158 allocate(global_ranks_in_world(npGlobal))
1159 global_ranks_in_world = (/ (i, i=0, npGlobal-1) /)
1160 call newDistribution(dist_global,global_Comm,global_ranks_in_world, &
1161 TYPE_BLOCK_CYCLIC,BlockSize,"global dist")
1162 deallocate(global_ranks_in_world)
1163 call MPI_Barrier(global_Comm,ierr)
1164
1165 ! Split the base communicator
1166 call MPI_Comm_Rank(base_comm, base_rank, ierr)
1167 ! "Row" communicator for independent operations on each spin
1168 ! The name refers to "spatial" degrees of freedom.
1169 color = mod(base_rank,n_spin) ! {0,1} for n_spin = 2, or {0} for n_spin = 1
1170 call MPI_Comm_Split(base_comm, color, base_rank, Spatial_Comm, ierr)
1171 ! "Column" communicator for spin reductions
1172 color = base_rank/n_spin
1173 call MPI_Comm_Split(base_comm, color, base_rank, Spin_Comm, ierr)
1174
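 ! An illustrative sketch (assuming 8 ranks in base_comm and n_spin = 2):
 ! Spatial_Comm teams are {0,2,4,6} (spin 1) and {1,3,5,7} (spin 2),
 ! while Spin_Comm pairs {0,1}, {2,3}, {4,5}, {6,7} for the reductions.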
1175 call MPI_Comm_Size(spatial_comm, npPerSpin, ierr) ! number of procs in each spin team
1176
1177 call MPI_Comm_Group(Spatial_Comm, Spatial_Group, Ierr)
1178 allocate(ranks_in_world(npPerSpin))
1179 call MPI_Comm_Group(global_comm, Global_Group, Ierr)
1180 call MPI_Group_translate_ranks( Spatial_Group, npPerSpin, &
1181 (/ (i,i=0,npPerSpin-1) /), &
1182 Global_Group, ranks_in_world, ierr )
1183
1184 call MPI_Group_Free(Spatial_Group, ierr)
1185 call MPI_Group_Free(Global_Group, ierr)
1186
1187 allocate (ranks_in_World_Spin(npPerSpin,n_spin))
1188 ! We need everybody to have this information, as all nodes
1189 ! are part of the global distribution side of the communication
1190 ! (global_comm is the common-address space)
1191
1192 ! But note that this will tell only those in base_comm...
1193 ! This routine might not be the end of the story for cascading splits
1194 ! (k-points and spin)
1195 ! We might need an extra 'broadcast' over the k-point 'column' comm.
1196 call MPI_AllGather(ranks_in_world,npPerSpin,MPI_integer,&
1197 Ranks_in_World_Spin(1,1),npPerSpin, &
1198 MPI_integer,Spin_Comm,ierr)
1199
1200 ! Create distributions known to all nodes (again, those in base_comm)
1201 do ispin = 1, n_spin
1202 call newDistribution(dist_spin(ispin), global_Comm, &
1203 Ranks_in_World_Spin(:,ispin), &
1204 TYPE_BLOCK_CYCLIC, blockSize, "SPATIAL dist")
1205 enddo
1206 deallocate(ranks_in_world,Ranks_in_World_Spin)
1207 call MPI_Barrier(global_Comm,ierr)
1208
1209end subroutine get_spin_comms_and_dists
1210
1211#endif /* SIESTA__ELSI */
1212
1213#if defined(SIESTA__ELSI) || defined(SIESTA__PEXSI)
1214
1215subroutine elsi_complex_solver(iscf, n_basis, n_basis_l, n_spin, nnz_l, numh, row_ptr, &
1216 col_idx, qtot, temp, ham, ovlp, dm, edm, ef, ets, &
1217 nkpnt, kpt_n, kpt, weight, kpt_comm, Get_EDM_Only)
1218
1219 use m_mpi_utils, only: globalize_sum
1220 use parallel, only: BlockSize
1221#ifdef MPI
1222 use mpi_siesta
1223#endif
1224 use class_Distribution
1225 use m_redist_spmatrix, only: aux_matrix, redistribute_spmatrix
1226 use alloc
1227
1228
1229 implicit none
1230
1231 integer, intent(in) :: iscf ! SCF step counter
1232 integer, intent(in) :: n_basis ! Global basis
1233 integer, intent(in) :: n_basis_l ! Local basis
1234 integer, intent(in) :: n_spin
1235 integer, intent(in) :: nnz_l ! Local nonzero
1236 integer, intent(in), target :: numh(n_basis_l)
1237 integer, intent(in) :: row_ptr(n_basis_l)
1238 integer, intent(in), target :: col_idx(nnz_l)
1239 real(dp), intent(in) :: qtot
1240 real(dp), intent(in) :: temp
1241 complex(dp), intent(inout), target :: ham(nnz_l,n_spin)
1242 complex(dp), intent(inout), target :: ovlp(nnz_l)
1243 complex(dp), intent(out) :: dm(nnz_l,n_spin)
1244 complex(dp), intent(out) :: edm(nnz_l,n_spin)
1245 real(dp), intent(out) :: ef ! Fermi energy
1246 real(dp), intent(out) :: ets ! Entropy/k, dimensionless
1247 integer, intent(in) :: nkpnt ! number of k-points
1248 integer, intent(in) :: kpt_n
1249 real(dp), intent(in) :: kpt(3)
1250 real(dp), intent(in) :: weight
1251 integer, intent(in) :: kpt_comm
1252
1253 integer :: ierr
1254 integer :: n_state
1255 integer :: nnz_g
1256
1257 real(dp) :: energy
1258
1259 integer, allocatable, dimension(:) :: row_ptr2
1260
1261 integer :: elsi_Spatial_comm, elsi_Spin_comm
1262
1263 type(distribution) :: dist_global
1264 type(distribution) :: dist_spin(2)
1265
1266 type(aux_matrix) :: pkg_global, pkg_spin ! Packages for transfer
1267
1268 integer :: my_spin
1269
1270 integer :: my_no_l
1271 integer :: my_nnz_l
1272 integer :: my_nnz
1273 integer, pointer :: my_row_ptr2(:) => null()
1274 integer :: i, ih, ispin, spin_rank, global_rank
1275 real(dp) :: ets_spin
1276
1277 integer, pointer :: my_col_idx(:)
1278 complex(dp), pointer :: my_S(:)
1279 complex(dp), pointer :: my_H(:)
1280 complex(dp), pointer :: my_DM(:) => null()
1281 complex(dp), pointer :: my_EDM(:) => null()
1282
1283 integer :: date_stamp
1284
1285 logical, intent(in) :: Get_EDM_Only
1286
1287 external :: timer
1288
1289#ifndef MPI
1290 call die("This ELSI solver interface needs MPI")
1291#endif
1292
1293 call timer("elsi-complex-solver", 1)
1294
1295 ! Initialization
1296 if (iscf == 1) then
1297
1298 ! Get ELSI options
1299 call elsi_get_opts()
1300
1301 ! Number of states to solve when calling an eigensolver
1302 n_state = min(n_basis, n_basis/2+5)
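 ! i.e., all states for very small bases, otherwise slightly more than
 ! half the basis size; with up to two electrons per state this should
 ! accommodate qtot electrons with a small safety margin.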
1303
1304 ! Now we have all ingredients to initialize ELSI
1305 call elsi_init(elsi_h, which_solver, MULTI_PROC, SIESTA_CSC, n_basis, &
1306 qtot, n_state)
1307
1308 ! Output
1309 call elsi_set_output(elsi_h, out_level)
1310 call elsi_set_output_log(elsi_h, out_json)
1311 call elsi_set_write_unit(elsi_h, 6)
1312
1313 ! Broadening
1314 call elsi_set_mu_broaden_scheme(elsi_h, which_broad)
1315 call elsi_set_mu_broaden_width(elsi_h, temp)
1316 call elsi_set_mu_mp_order(elsi_h, mp_order)
1317
1318 ! Solver settings
1319 call elsi_set_elpa_solver(elsi_h, elpa_flavor)
1320
1321 call elsi_set_omm_flavor(elsi_h, omm_flavor)
1322 call elsi_set_omm_n_elpa(elsi_h, omm_n_elpa)
1323 call elsi_set_omm_tol(elsi_h, omm_tol)
1324
1325 if (pexsi_tasks_per_pole /= ELSI_NOT_SET) then
1326 call elsi_set_pexsi_np_per_pole(elsi_h, pexsi_tasks_per_pole)
1327 end if
1328
1329 call elsi_set_pexsi_np_symbo(elsi_h, pexsi_tasks_symbolic)
1330 call elsi_set_pexsi_temp(elsi_h, temp)
1331! call elsi_set_pexsi_gap(elsi_h, gap)
1332! call elsi_set_pexsi_delta_e(elsi_h, delta_e)
1333 call elsi_set_pexsi_mu_min(elsi_h, -10.0_dp)
1334 call elsi_set_pexsi_mu_max(elsi_h, 10.0_dp)
1335
1336 if (sips_n_slice /= ELSI_NOT_SET) then
1337 call elsi_set_sips_n_slice(elsi_h, sips_n_slice)
1338 end if
1339
1340 call elsi_set_sips_n_elpa(elsi_h, sips_n_elpa)
1341 call elsi_set_sips_interval(elsi_h, -10.0_dp, 10.0_dp)
1342
1343 call elsi_set_ntpoly_method(elsi_h, ntpoly_method)
1344 call elsi_set_ntpoly_filter(elsi_h, ntpoly_filter)
1345 call elsi_set_ntpoly_tol(elsi_h, ntpoly_tol)
1346
1347 endif ! iscf == 1
1348
1349 !print *, global_rank, "| ", " Entering elsi_complex_solver"
1350
1351 if (n_spin == 1) then
1352
1353 ! Sparsity pattern
1354 call globalize_sum(nnz_l, nnz_g, comm=kpt_comm)
1355
1356 allocate(row_ptr2(n_basis_l+1))
1357 row_ptr2(1:n_basis_l) = row_ptr(1:n_basis_l)+1
1358 row_ptr2(n_basis_l+1) = nnz_l+1
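 ! For illustration (hypothetical counts): with n_basis_l = 2, nnz_l = 5,
 ! and row counts (3,2), row_ptr = (/ 0, 3 /) becomes
 ! row_ptr2 = (/ 1, 4, 6 /), the 1-based pointer of n_basis_l+1 entries
 ! in the CSC format that ELSI expects.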
1359
1360 call elsi_set_csc(elsi_h, nnz_g, nnz_l, n_basis_l, col_idx, row_ptr2)
1361 deallocate(row_ptr2)
1362
1363 call elsi_set_csc_blk(elsi_h, BlockSize)
1364 call elsi_set_kpoint(elsi_h, nkpnt, kpt_n, weight)
1365 call elsi_set_mpi(elsi_h, kpt_comm)
1366 call elsi_set_mpi_global(elsi_h, elsi_global_comm)
1367
1368 else
1369
1370 call mpi_comm_rank( elsi_global_Comm, global_rank, ierr )
1371
1372 ! MPI logic for spin polarization
1373
1374 ! Split the communicator in spins and get distribution objects
1375 ! for the data redistribution needed
1376 ! Note that dist_spin is an array
1377 call get_spin_comms_and_dists(kpt_comm,kpt_comm, & !! **** kpt_comm as global?
1378 blocksize, n_spin, &
1379 dist_global,dist_spin, elsi_spatial_comm, elsi_spin_comm)
1380
1381 ! Find out which spin team we are in, and tag the spin we work on
1382 call mpi_comm_rank( elsi_Spin_Comm, spin_rank, ierr )
1383 my_spin = spin_rank+1 ! {1,2}
1384
1385 !print *, global_rank, "| ", "spin ", my_spin, " After spin splitting"
1386
1387 ! This is done serially, each time filling one spin set
1388 ! Note that **all processes** need to have the same pkg_global
1389
1390 do ispin = 1, n_spin
1391
1392 ! Load pkg_global data package
1393 pkg_global%norbs = n_basis
1394 pkg_global%no_l = n_basis_l
1395 pkg_global%nnzl = nnz_l
1396 pkg_global%numcols => numh
1397 pkg_global%cols => col_idx
1398
1399 allocate(pkg_global%complex_vals(2))
1400 ! Link the vals items to the appropriate arrays (no extra memory here)
1401 pkg_global%complex_vals(1)%data => ovlp(:)
1402 ! Note that we *cannot* say => ham(:,my_spin)
1403 ! and avoid the sequential loop, as then half the processors will send
1404 ! the information for 'spin up' and the other half the information for 'spin down',
1405 ! which is *not* what we want.
1406 pkg_global%complex_vals(2)%data => ham(:,ispin)
1407
1408 call timer("redist_orbs_fwd", 1)
1409
1410 ! We are doing the transfers sequentially. One spin team is
1411 ! 'idle' (on the receiving side) in each pass, as the dist_spin(ispin) distribution
1412 ! does not involve them.
1413
1414 call redistribute_spmatrix(n_basis,pkg_global,dist_global, &
1415 pkg_spin,dist_spin(ispin),kpt_Comm)
1416
1417 call timer("redist_orbs_fwd", 2)
1418
1419 if (my_spin == ispin) then ! Each team gets their own data
1420
1421 !nrows = pkg_spin%norbs ! or simply 'norbs'
1422 my_no_l = pkg_spin%no_l
1423 my_nnz_l = pkg_spin%nnzl
1424 call MPI_AllReduce(my_nnz_l,my_nnz,1,MPI_integer,MPI_sum,elsi_Spatial_Comm,ierr)
1425 ! generate off-by-one row pointer
1426 call re_alloc(my_row_ptr2,1,my_no_l+1,"my_row_ptr2","elsi_solver")
1427 my_row_ptr2(1) = 1
1428 do ih = 1,my_no_l
1429 my_row_ptr2(ih+1) = my_row_ptr2(ih) + pkg_spin%numcols(ih)
1430 enddo
1431
1432 my_col_idx => pkg_spin%cols
1433 my_S => pkg_spin%complex_vals(1)%data
1434 my_H => pkg_spin%complex_vals(2)%data
1435
1436 call re_alloc(my_DM,1,my_nnz_l,"my_DM","elsi_solver")
1437 call re_alloc(my_EDM,1,my_nnz_l,"my_EDM","elsi_solver")
1438 endif
1439
1440 ! Clean pkg_global
1441 nullify(pkg_global%complex_vals(1)%data)
1442 nullify(pkg_global%complex_vals(2)%data)
1443 deallocate(pkg_global%complex_vals)
1444 nullify(pkg_global%numcols)
1445 nullify(pkg_global%cols)
1446
1447 enddo
1448
1449 !print *, global_rank, "| ", "spin ", my_spin, "Done spin transfers"
1450
1451 call elsi_set_csc(elsi_h, my_nnz, my_nnz_l, my_no_l, my_col_idx, my_row_ptr2)
1452 call de_alloc(my_row_ptr2,"my_row_ptr2","elsi_solver")
1453
1454 call elsi_set_csc_blk(elsi_h, BlockSize)
1455 call elsi_set_spin(elsi_h, n_spin, my_spin)
1456 call elsi_set_kpoint(elsi_h, nkpnt, kpt_n, weight)
1457 call elsi_set_mpi(elsi_h, elsi_Spatial_comm)
1458 call elsi_set_mpi_global(elsi_h, elsi_global_comm)
1459
1460 endif ! n_spin
1461
1462 call timer("elsi-solver", 1)
1463
1464 !print *, global_rank, "| ", "About to call elsi_dm"
1465 if (n_spin == 1) then
1466 if (.not.Get_EDM_Only) then
1467 call elsi_dm_complex_sparse(elsi_h, ham, ovlp, DM, energy)
1468 call elsi_get_entropy(elsi_h, ets)
1469 else
1470 call elsi_get_edm_complex_sparse(elsi_h, EDM)
1471 endif
1472 else
1473 ! Solve DM, and get (at every step for now) EDM, Fermi energy, and entropy
1474 ! Energy is already summed over spins
1475 if (.not.Get_EDM_Only) then
1476 call elsi_dm_complex_sparse(elsi_h, my_H, my_S, my_DM, energy)
1477 !... but we still need to sum the entropy over spins
1478 call elsi_get_entropy(elsi_h, ets_spin)
1479 else
1480 call elsi_get_edm_complex_sparse(elsi_h, my_EDM)
1481 endif
1482#ifdef SIESTA__ELSI__OLD_SPIN_CONVENTION
1483 ! NOTE**
1484 ! For ELSI versions before 2018-08-17 we still need to sum the entropy over spins
1485 ! ... but figuring out the version programmatically is not trivial, as the relevant routines changed...
1486 call globalize_sum(ets_spin, ets, comm=elsi_Spin_comm)
1487#else
1488 ! If this fails to link, your version of ELSI is old, and you should
1489 ! be using the pre-processor symbol above.
1490 ! (This check works because 'elsi_get_datestamp' was added at nearly the
1491 ! same time as the spin-reduction-convention change: Aug 17 vs Aug 21, 2018.)
1492 ! If you are unlucky enough to have an ELSI version in between, get rid of it.
1493 call elsi_get_datestamp(date_stamp)
 ! With the new convention the entropy retrieved above is already summed
 ! over spins (per the note for the old convention), so use it directly;
 ! without this assignment 'ets' would be left undefined here.
 ets = ets_spin
1494#endif
1495 ! And over kpoints?? -- not necessary, ever
1496 endif
1497
1498 call elsi_get_mu(elsi_h, ef)
1499 ets = ets/temp
1500
1501 ! Ef, energy, and ets are known to all nodes
1502
1503 call timer("elsi-solver", 2)
1504 !print *, global_rank, "| ", "Done elsi_dm"
1505
1506 if ( n_spin == 2) then
1507 ! Now we need to redistribute back
1508
1509 do ispin = 1, n_spin
1510
1511 if (my_spin == ispin) then
1512 ! Prepare pkg_spin to transfer the right spin information
1513 ! The other fields (numcols, cols) are the same and are still there
1514 ! Deallocate my_S and my_H
1515 call de_alloc(pkg_spin%complex_vals(1)%data,"pkg_spin%vals(1)%data","elsi_solver")
1516 call de_alloc(pkg_spin%complex_vals(2)%data,"pkg_spin%vals(2)%data","elsi_solver")
1517
1518 pkg_spin%complex_vals(1)%data => my_DM(1:my_nnz_l)
1519 pkg_spin%complex_vals(2)%data => my_EDM(1:my_nnz_l)
1520
1521 endif
1522
1523 ! pkg_global is clean now
1524 call timer("redist_orbs_bck", 1)
1525 call redistribute_spmatrix(n_basis,pkg_spin,dist_spin(ispin) &
1526 ,pkg_global,dist_global,kpt_Comm)
1527 call timer("redist_orbs_bck", 2)
1528
1529 ! Clean pkg_spin
1530 if (my_spin == ispin) then
1531 ! Each team deallocates during "its" spin cycle
1532 call de_alloc(my_DM, "my_DM", "elsi_solver")
1533 call de_alloc(my_EDM,"my_EDM","elsi_solver")
1534
1535 nullify(pkg_spin%complex_vals(1)%data) ! formerly pointing to DM
1536 nullify(pkg_spin%complex_vals(2)%data) ! formerly pointing to EDM
1537 deallocate(pkg_spin%complex_vals)
1538 ! allocated in the direct transfer
1539 call de_alloc(pkg_spin%numcols,"pkg_spin%numcols","elsi_solver")
1540 call de_alloc(pkg_spin%cols, "pkg_spin%cols", "elsi_solver")
1541 endif
1542
1543
1544 ! In future, pkg_global%vals(1,2) could point to DM and EDM, and the
1545 ! 'redistribute' routine could check whether the vals arrays are
1546 ! associated and use them instead of allocating them.
1547 DM(:,ispin) = pkg_global%complex_vals(1)%data(:)
1548 EDM(:,ispin) = pkg_global%complex_vals(2)%data(:)
1549 ! Check no_l
1550 if (n_basis_l /= pkg_global%no_l) then
1551 call die("Mismatch in no_l")
1552 endif
1553 ! Check listH
1554 if (any(col_idx(:) /= pkg_global%cols(:))) then
1555 call die("Mismatch in listH")
1556 endif
1557
1558 ! Clean pkg_global
1559 ! allocated by the transfer routine, but we did not actually
1560 ! look at them
1561 call de_alloc(pkg_global%numcols,"pkg_global%numcols","elsi_solver")
1562 call de_alloc(pkg_global%cols, "pkg_global%cols", "elsi_solver")
1563
1564 call de_alloc(pkg_global%complex_vals(1)%data,"pkg_global%vals(1)%data","elsi_solver")
1565 call de_alloc(pkg_global%complex_vals(2)%data,"pkg_global%vals(2)%data","elsi_solver")
1566 deallocate(pkg_global%complex_vals)
1567
1568 enddo
1569
1570 call MPI_Comm_Free(elsi_Spatial_comm, ierr)
1571 call MPI_Comm_Free(elsi_Spin_comm, ierr)
1572
1573 else ! n_spin == 1
1574
1575 ! Nothing else to do
1576 endif
1577
1578 call timer("elsi-complex-solver", 2)
1579
1580
1581end subroutine elsi_complex_solver
1582
1583subroutine transpose(a,b)
1584 real(dp), intent(in) :: a(:,:)
1585 real(dp), allocatable, intent(out) :: b(:,:)
1586
1587 integer n, m, i, j
1588
1589 n = size(a,1)
1590 m = size(a,2)
1591 allocate(b(m,n))
1592 do i = 1, n
1593 do j = 1, m
1594 b(j, i) = a(i, j)
1595 enddo
1596 enddo
1597end subroutine transpose
1598
1599#endif /* SIESTA__ELSI || SIESTA__PEXSI */
1600
1601end module m_elsi_interface
=== modified file 'Src/m_ncdf_siesta.F90'
--- Src/m_ncdf_siesta.F90 2019-01-04 07:01:47 +0000
+++ Src/m_ncdf_siesta.F90 2019-01-16 11:00:58 +0000
@@ -51,6 +51,7 @@
     use siesta_options, only: SOLVE_DIAGON, SOLVE_ORDERN
     use siesta_options, only: SOLVE_MINIM, SOLVE_TRANSI
     use siesta_options, only: SOLVE_PEXSI
+    use siesta_options, only: SOLVE_ELSI
     use siesta_options, only: savehs
     use siesta_options, only: fixspin, total_spin
     use siesta_options, only: ia1, ia2, dx ! FC information
@@ -361,6 +362,8 @@
       dic = dic//('method'.kv.'transiesta')
     else if ( isolve == SOLVE_PEXSI ) then
       dic = dic//('method'.kv.'pexsi')
+    else if ( isolve == SOLVE_ELSI ) then
+      dic = dic//('method'.kv.'elsi')
     end if

     ! Attributes are collective

=== modified file 'Src/m_pexsi_dos.F90'
--- Src/m_pexsi_dos.F90 2016-08-04 09:03:32 +0000
+++ Src/m_pexsi_dos.F90 2019-01-16 11:00:58 +0000
@@ -99,7 +99,7 @@
 !
 ! Our global communicator is a duplicate of the passed communicator
 !
-call MPI_Comm_Dup(true_MPI_Comm_World, World_Comm, ierr)
+call MPI_Comm_Dup(mpi_comm_dft, World_Comm, ierr)
 call mpi_comm_rank( World_Comm, mpirank, ierr )

 call timer("pexsi_dos", 1)

=== modified file 'Src/m_pexsi_driver.F90'
--- Src/m_pexsi_driver.F90 2016-08-04 09:03:32 +0000
+++ Src/m_pexsi_driver.F90 2019-01-16 11:00:58 +0000
@@ -123,7 +123,7 @@
 !
 ! Our global communicator is a duplicate of the passed communicator
 !
-call MPI_Comm_Dup(true_MPI_Comm_World, World_Comm, ierr)
+call MPI_Comm_Dup(MPI_Comm_DFT, World_Comm, ierr)
 call mpi_comm_rank( World_Comm, mpirank, ierr )

 ! NOTE: fdf calls will assign values to the whole processor set,

=== modified file 'Src/m_pexsi_local_dos.F90'
--- Src/m_pexsi_local_dos.F90 2018-04-27 09:35:52 +0000
+++ Src/m_pexsi_local_dos.F90 2019-01-16 11:00:58 +0000
@@ -144,7 +144,7 @@
     !
     ! Our global communicator is a duplicate of the passed communicator
     !
-    call MPI_Comm_Dup(true_MPI_Comm_World, World_Comm, ierr)
+    call MPI_Comm_Dup(MPI_Comm_DFT, World_Comm, ierr)
     call mpi_comm_rank( World_Comm, mpirank, ierr )

     call timer("pexsi-ldos", 1)

=== modified file 'Src/m_redist_spmatrix.F90'
--- Src/m_redist_spmatrix.F90 2016-09-14 09:35:16 +0000
+++ Src/m_redist_spmatrix.F90 2019-01-16 11:00:58 +0000
@@ -1,18 +1,44 @@
 !
-! --- Tangled code
+! This module implements functionality to change the MPI parallel
+! distribution of a sparse matrix stored in the modified
+! block-compressed form used in Siesta.
+!
+! The immediate goal is to exchange matrices between Siesta and the
+! PEXSI library. Siesta uses a "block-cyclic" distribution over
+! N_Siesta processors, and PEXSI uses a simple "greedy" blocked
+! distribution on each pole-group, with npPerPole processors. The
+! code works in principle with any distribution (subject to some
+! conditions spelled out below). The source and target process groups
+! are arbitrary, and can overlap. In fact, in SIESTA-PEXSI they
+! *will* overlap, as otherwise we would need extra processors for the
+! Siesta operations. This overlap forces the code to re-invent much
+! of the logic involved in MPI intercommunicators, which cannot be
+! created for non-disjoint groups.
+
+! Written by Alberto Garcia
+
+! --- Tangled code (using Org mode in emacs)
+! Alas, the Org version is no longer maintained.
+!
 module m_redist_spmatrix
-#ifdef SIESTA__PEXSI
+#if defined (SIESTA__PEXSI) || defined (SIESTA__ELSI)
   implicit none
+  integer, parameter, private :: dp = selected_real_kind(10,100)
+
+  ! Some auxiliary derived types
+
   type, public :: comm_t
     integer :: src, dst, i1, i2, nitems
   end type comm_t

-  integer, parameter, private :: dp = selected_real_kind(10,100)
-
   type, public :: dp_pointer
     ! boxed array pointer type
     real(dp), pointer :: data(:) => null()
   end type dp_pointer
+  type, public :: complex_dp_pointer
+    ! boxed array pointer type
+    complex(dp), pointer :: data(:) => null()
+  end type complex_dp_pointer

   type, public :: aux_matrix
     integer :: norbs = -1
@@ -20,14 +46,21 @@
     integer :: nnzl = -1
     integer, pointer :: numcols(:) => null()
     integer, pointer :: cols(:) => null()
-    ! array of 1D pointers
+    ! arrays of 1D pointers
     type(dp_pointer), dimension(:), pointer :: vals(:) => null()
+    type(complex_dp_pointer), dimension(:), pointer :: complex_vals(:) => null()
   end type aux_matrix
+
+
   public :: redistribute_spmatrix
+
 CONTAINS
+
   subroutine redistribute_spmatrix(norbs,m1,dist1,m2,dist2,bridge_comm)

-    use mpi
+    use mpi ! Note that this is fine, as we do not use any Siesta-specific
+            ! parameter, such as a modified mpi_comm_world
+
     use class_Distribution
     use alloc, only: re_alloc, de_alloc

@@ -47,10 +80,11 @@
     logical :: proc_in_set1, proc_in_set2
     integer :: ierr

-    integer :: i, io, g1, g2, j, nvals
+    integer :: i, io, g1, g2, j, nvals, nvals_complex
     integer :: comparison, n1, n2, c1, c2
     integer, parameter :: dp = selected_real_kind(10,100)
     real(dp), dimension(:), pointer :: data1 => null(), data2 => null()
+    complex(dp), dimension(:), pointer :: cdata1 => null(), cdata2 => null()

     integer, allocatable :: ranks1(:), ranks2(:)

@@ -109,10 +143,10 @@
     proc_in_set1 = (myrank1 /= MPI_UNDEFINED)
     proc_in_set2 = (myrank2 /= MPI_UNDEFINED)

-    if (proc_in_set1 .or. proc_in_set2) then
-       print "(a,3i6,2l2)", "world_rank, rank1, rank2, ing1?, ing2?", myid, &
-            myrank1, myrank2, proc_in_set1, proc_in_set2
-    endif
+!    if (proc_in_set1 .or. proc_in_set2) then
+!       print "(a,3i6,2l2)", "world_rank, rank1, rank2, ing1?, ing2?", myid, &
+!            myrank1, myrank2, proc_in_set1, proc_in_set2
+!    endif

     ! Figure out the communication needs
     call analyze_comms()
@@ -130,10 +164,10 @@
       call re_alloc(m2%numcols,1,m2%no_l,"m2%numcols","redistribute_spmatrix")
     endif

-    if (myid == 0) print *, "About to transfer numcols..."
+!    if (myid == 0) print *, "About to transfer numcols..."
     call do_transfers_int(comms,m1%numcols,m2%numcols, &
                           g1,g2,bridge_comm)
-    if (myid == 0) print *, "Transferred numcols."
+!    if (myid == 0) print *, "Transferred numcols."

     ! We need to tell the processes in set 2 how many
     ! "vals" to expect.
@@ -143,12 +177,18 @@
       else
         nvals = 0
       endif
+      if (associated(m1%complex_vals)) then
+        nvals_complex = size(m1%complex_vals)
+      else
+        nvals_complex = 0
+      endif
     endif
     ! Now do a broadcast within bridge_comm, using as root one
     ! process in the first set. Let's say the one with rank 0
     ! in g1, the first in the set, which will have rank=ranks1(1)
     ! in bridge_comm
     call MPI_Bcast(nvals,1,MPI_Integer,ranks1(1),bridge_comm,ierr)
+    call MPI_Bcast(nvals_complex,1,MPI_Integer,ranks1(1),bridge_comm,ierr)

     ! Now we can figure out how many non-zeros there are
     if (proc_in_set2) then
@@ -163,6 +203,14 @@
         enddo
       endif

+      if (nvals_complex > 0) then
+        allocate(m2%complex_vals(nvals_complex))
+        do j=1,nvals_complex
+          call re_alloc(m2%complex_vals(j)%data,1,m2%nnzl, &
+                        "m2%complex_vals(j)%data","redistribute_spmatrix")
+        enddo
+      endif
+
     endif

     ! Generate a new comms-structure with new start/count indexes
@@ -202,12 +250,12 @@
 !!$        endif
 !!$     enddo

-    if (myid == 0) print *, "About to transfer cols..."
+!    if (myid == 0) print *, "About to transfer cols..."
     ! Transfer the cols arrays
     call do_transfers_int(commsnnz,m1%cols,m2%cols, &
                           g1, g2, bridge_comm)

-    if (myid == 0) print *, "About to transfer values..."
+!    if (myid == 0) print *, "About to transfer values..."
     ! Transfer the values arrays
     do j=1, nvals
       if (proc_in_set1) data1 => m1%vals(j)%data
@@ -216,7 +264,17 @@
            g1,g2,bridge_comm)
     enddo
     nullify(data1,data2)
-    if (myid == 0) print *, "Done transfers."
+
+!    if (myid == 0) print *, "About to transfer complex values..."
+    ! Transfer the values arrays
+    do j=1, nvals_complex
+      if (proc_in_set1) cdata1 => m1%complex_vals(j)%data
+      if (proc_in_set2) cdata2 => m2%complex_vals(j)%data
+      call do_transfers_complex_dp(commsnnz,cdata1,cdata2, &
+           g1,g2,bridge_comm)
+    enddo
+    nullify(cdata1,cdata2)
+!    if (myid == 0) print *, "Done transfers."

     deallocate(commsnnz)
     deallocate(comms)
@@ -312,9 +370,9 @@
     if (myid == 0 .and. comms_not_printed) then
        do i = 1, ncomms
           c => comms(i)
-          write(6,"(a,i5,a,2i5,2i7,i5)") &
-               "comm: ", i, " src, dst, i1, i2, n:", &
-               c%src, c%dst, c%i1, c%i2, c%nitems
+!          write(6,"(a,i5,a,2i5,2i7,i5)") &
+!               "comm: ", i, " src, dst, i1, i2, n:", &
+!               c%src, c%dst, c%i1, c%i2, c%nitems
        enddo
        comms_not_printed = .false.
     endif
@@ -543,6 +601,114 @@
     deallocate(local_reqR, local_reqS, statuses)

   end subroutine do_transfers_dp
+  !--------------------------------------------------
+  subroutine do_transfers_complex_dp(comms,data1,data2,g1,g2,bridge_comm)
+
+    use mpi
+    integer, parameter :: dp = selected_real_kind(10,100)
+
+    type(comm_t), intent(in), target :: comms(:)
+    complex(dp), dimension(:), pointer :: data1
+    complex(dp), dimension(:), pointer :: data2
+    integer, intent(in) :: g1
+    integer, intent(in) :: g2
+    integer, intent(in) :: bridge_comm
+
+    integer :: basegroup, nsize1, nsize2, ierr
+    integer, allocatable :: comm_rank1(:), comm_rank2(:)
+
+
+    integer :: ncomms
+    integer :: i
+    integer :: nrecvs_local, nsends_local
+    integer, allocatable :: statuses(:,:), local_reqR(:), local_reqS(:)
+    integer :: src_in_comm, dst_in_comm
+    integer :: myrank1, myrank2, myid
+    type(comm_t), pointer :: c
+
+    call MPI_Comm_Rank( bridge_comm, myid, ierr )
+    ! print *, "Entering transfer_complex_dp"
+    ! print *, "rank, Associated data1: ", myid, associated(data1)
+    ! print *, "rank, Associated data2: ", myid, associated(data2)
+
+    ! Find the rank correspondences, in case
+    ! there is implicit renumbering at the time of group creation
+
+    call MPI_Comm_group( bridge_comm, basegroup, ierr )
+    call MPI_Group_Size( g1, nsize1, ierr )
+    call MPI_Group_Size( g2, nsize2, ierr )
+    allocate(comm_rank1(0:nsize1-1))
+    call MPI_Group_translate_ranks( g1, nsize1, (/ (i,i=0,nsize1-1) /), &
+                                    basegroup, comm_rank1, ierr )
+    ! print "(a,10i3)", "Ranks of g1 in base group:", comm_rank1
+    allocate(comm_rank2(0:nsize2-1))
+    call MPI_Group_translate_ranks( g2, nsize2, (/ (i,i=0,nsize2-1) /), &
+                                    basegroup, comm_rank2, ierr )
+    ! print "(a,10i3)", "Ranks of g2 in base group:", comm_rank2
+
+    call mpi_group_rank(g1,myrank1,ierr)
+    call mpi_group_rank(g2,myrank2,ierr)
+
+    ! Do the actual transfers.
+    ! This version with non-blocking communications
+
+    ncomms = size(comms)
+
+    ! Some bookkeeping for the requests
+    nrecvs_local = 0
+    nsends_local = 0
+    do i=1,ncomms
+       c => comms(i)
+       if (myrank2 == c%dst) then
+          nrecvs_local = nrecvs_local + 1
+       endif
+       if (myrank1 == c%src) then
+          nsends_local = nsends_local + 1
+       endif
+    enddo
+    allocate(local_reqR(nrecvs_local))
+    allocate(local_reqS(nsends_local))
+    allocate(statuses(mpi_status_size,nrecvs_local))
+
+    ! First, post the receives
+    nrecvs_local = 0
+    do i=1,ncomms
+       c => comms(i)
+       if (myrank2 == c%dst) then
+          nrecvs_local = nrecvs_local + 1
+          src_in_comm = comm_rank1(c%src)
+          call MPI_irecv(data2(c%i2),c%nitems,MPI_Double_Complex,src_in_comm, &
+                         i,bridge_comm,local_reqR(nrecvs_local),ierr)
+       endif
+    enddo
+
+    ! Post the sends
+    nsends_local = 0
+    do i=1,ncomms
+       c => comms(i)
+       if (myrank1 == c%src) then
+          nsends_local = nsends_local + 1
+          dst_in_comm = comm_rank2(c%dst)
+          call MPI_isend(data1(c%i1),c%nitems,MPI_Double_Complex,dst_in_comm, &
+                         i,bridge_comm,local_reqS(nsends_local),ierr)
+       endif
+    enddo
+
+    ! A former loop of waits can be substituted by a "waitall",
+    ! with every processor keeping track of the actual number of
+    ! requests in which it is involved.
+
+    ! Should we wait also on the sends?
+
+    call MPI_waitall(nrecvs_local, local_reqR, statuses, ierr)
+
+
+    ! This barrier is needed, I think
+    call MPI_Barrier(bridge_comm,ierr)
+
+    deallocate(local_reqR, local_reqS, statuses)
+
+  end subroutine do_transfers_complex_dp
 #endif
 end module m_redist_spmatrix
 ! --- End of tangled code

=== modified file 'Src/parallel.F'
--- Src/parallel.F 2017-07-14 12:17:28 +0000
+++ Src/parallel.F 2019-01-16 11:00:58 +0000
@@ -1,5 +1,5 @@
 ! ---
-! Copyright (C) 1996-2016 The SIESTA group
+! Copyright- (C) 1996-2016 The SIESTA group
 ! This file is distributed under the terms of the
 ! GNU General Public License: see COPYING in the top directory
 ! or http://www.gnu.org/copyleft/gpl.txt .

=== modified file 'Src/post_scf_work.F'
--- Src/post_scf_work.F 2018-09-12 13:49:21 +0000
+++ Src/post_scf_work.F 2019-01-16 11:00:58 +0000
@@ -27,12 +27,13 @@
       use class_Fstack_Pair_Geometry_dSpData2D, only: new, max_size
       use class_Fstack_Pair_Geometry_dSpData2D, only: push

-      use parallel, only: ionode
+      use parallel, only : IOnode, SIESTA_worker
+
       use atomlist, only: lasto, rmaxo, datm, indxuo, no_s, no_u,
      &                    iphorb, no_l, qtot, qtots
       use m_energies
       use neighbour, only : maxna=>maxnna ! For plcharge...
-      use m_spin, only : nspin, h_spin_dim
+      use m_spin, only : nspin, h_spin_dim, spin
       use m_spin, only : spinor_dim
       use m_diagon, only : diagon
       use m_dminim, only : dminim
@@ -41,6 +42,9 @@
       use m_compute_dm, only : PreviousCallDiagon
       use m_eo
       use kpoint_scf_m, only: kpoint_scf, gamma_scf
+#ifdef SIESTA__ELSI
+      use m_elsi_interface, only: elsi_getdm
+#endif
       implicit none

       ! MD-step, SCF-step
@@ -51,10 +55,37 @@
       type(Pair_Geometry_dSpData2D) :: pair
       type(Geometry) :: geom

+      real(dp), allocatable :: Dscf_dummy(:,:)
+      real(dp) :: ef_dummy
+      real(dp) :: Entropy_dummy
+
       call timer( 'PostSCF', 1 )

! If converged, make one last iteration to find forces and stress

+! If we use the ELSI interface, the energy-density
+! matrix is not calculated inside the SCF step, and
+! so this must be done now
+
+#ifdef SIESTA__ELSI
+      if (isolve .eq. SOLVE_ELSI) then
+         allocate(Dscf_dummy(maxnh,spin%DM))
+         ! Note that we do not care about the content of H and S
+         ! Only that the appropriate elsi_h handle is still active
+         ! and allows us to get the appropriate EDM
+         call elsi_getdm(iscf, no_s, spin%spinor,
+     &                   no_l, maxnh, no_u,
+     &                   numh, listhptr, listh,
+     &                   H, S, qtot, temp,
+     &                   xijo,
+     &                   kpoint_scf%N, kpoint_scf%k, kpoint_scf%w,
+     &                   eo, qo, Dscf_dummy, Escf, ef_dummy, Entropy_dummy,
+     &                   occtol, neigwanted, Get_EDM_Only=.true.)
+         deallocate(Dscf_dummy)
+      endif
+      if (.not. SIESTA_worker) RETURN
+#endif
+
! If we use the minimization routine, the energy-density
! matrix is not calculated inside the SCF step, and
! so this must be done now

=== modified file 'Src/read_options.F90'
--- Src/read_options.F90 2018-12-02 00:15:31 +0000
+++ Src/read_options.F90 2019-01-16 11:00:58 +0000
@@ -76,6 +76,7 @@
 !   4 = PEXSI
 !   5 = (Matrix write)
 !   6 = CheSS
+!   7 = ELSI
 ! real*8 temp : Temperature for Fermi smearing (Ry)
 ! logical fixspin : Fix the spin of the system?
 ! real*8 total_spin : Total spin of the system
@@ -706,6 +707,16 @@
 #else
       call die("PEXSI solver is not compiled in. Use -DSIESTA__PEXSI")
 #endif
+    else if (leqi(method,"elsi")) then
+#ifdef SIESTA__ELSI
+      isolve = SOLVE_ELSI
+      if (ionode) then
+        call add_citation("ELSI PAPER***")
+        write(*,3) 'redata: Method of Calculation', 'ELSI'
+      endif
+#else
+      call die("ELSI solver is not compiled in. Use -DSIESTA__ELSI")
+#endif

 #ifdef SIESTA__CHESS
     else if (leqi(method,'chess')) then

=== modified file 'Src/siesta.F'
--- Src/siesta.F 2018-05-30 09:15:30 +0000
+++ Src/siesta.F 2019-01-16 11:00:58 +0000
@@ -21,7 +21,7 @@
       use parallel, only: SIESTA_worker ! whether part of Siesta's communicator
       use parallel, only: ionode
 #ifdef MPI
-      use mpi_siesta, only: true_mpi_comm_world
+      use mpi_siesta, only: mpi_comm_dft ! Includes Solver nodes
 #endif
       use m_mpi_utils, only: broadcast

@@ -35,12 +35,12 @@
       integer :: istep
       logical :: relaxd

-! Notes for PEXSI operation
+! Notes for PEXSI and ELSI operation (extra procs for Solver)
 !
-! A subset of nodes carries out non-PEXSI Siesta operations
+! A subset of nodes carries out non-Solver Siesta operations
 ! (i.e., setting up H, moving atoms, diagonalizing...).
 ! These are tagged as "SIESTA_worker" (admittedly, a bad name)
-! All nodes are involved in the PEXSI electronic-structure solver,
+! All nodes are involved in the (PEXSI or ELSI) electronic-structure solver,
 ! and in the new LocalDOS computation based on selected inversion.
 !
 ! 'siesta_init', 'siesta_forces', and 'siesta_analysis' need to
@@ -66,11 +66,11 @@
 C     Begin of coordinate relaxation iteration
       relaxd = .false.

-#ifdef SIESTA__PEXSI
+#if defined (SIESTA__PEXSI) || defined (SIESTA__ELSI)
       ! Broadcast relevant things for program logic
      ! These were set in siesta_options, called only by "SIESTA_workers".
-      call broadcast(inicoor,comm=true_MPI_Comm_World)
-      call broadcast(fincoor,comm=true_MPI_Comm_World)
+      call broadcast(inicoor,comm=mpi_comm_dft)
+      call broadcast(fincoor,comm=mpi_comm_dft)
 #endif
       istep = inicoor
       DO WHILE ((istep.le.fincoor) .AND. (.not. relaxd))
@@ -82,8 +82,8 @@
         end if

         if (SIESTA_worker) call siesta_move( istep, relaxd )
-#ifdef SIESTA__PEXSI
-        call broadcast(relaxd,comm=true_MPI_Comm_World)
+#if defined (SIESTA__PEXSI) || defined (SIESTA__ELSI)
+        call broadcast(relaxd,comm=mpi_comm_dft)
 #endif
         if (.not. relaxd) then
           istep = istep + 1

=== modified file 'Src/siesta_forces.F90'
--- Src/siesta_forces.F90 2018-09-24 07:34:04 +0000
+++ Src/siesta_forces.F90 2019-01-16 11:00:58 +0000
@@ -84,6 +84,9 @@
 #ifdef SIESTA__PEXSI
     use m_pexsi, only: pexsi_finalize_scfloop
 #endif
+#ifdef SIESTA__ELSI
+    use m_elsi_interface, only: elsi_finalize_scfloop
+#endif
     use m_check_walltime

     use m_energies, only: DE_NEGF
@@ -140,10 +143,10 @@
     call write_debug( ' PRE siesta_forces' )
 #endif

-#ifdef SIESTA__PEXSI
+#if defined (SIESTA__PEXSI) || defined (SIESTA__ELSI)
     ! Broadcast relevant things for program logic
     ! These were set in read_options, called only by "SIESTA_workers".
-    call broadcast(nscf, comm=true_MPI_Comm_World)
+    call broadcast(nscf, comm=mpi_comm_dft)
 #endif

     if ( SIESTA_worker ) then
@@ -484,9 +487,9 @@

     end if

-#ifdef SIESTA__PEXSI
-    call broadcast(iscf, comm=true_MPI_Comm_World)
-    call broadcast(SCFconverged, comm=true_MPI_Comm_World)
+#if defined (SIESTA__PEXSI) || defined (SIESTA__ELSI)
+    call broadcast(iscf, comm=mpi_comm_dft)
+    call broadcast(SCFconverged, comm=mpi_comm_dft)
 #endif

     ! Exit if converged
@@ -500,9 +503,15 @@
     end if
 #endif

+<<<<<<< TREE
     if ( .not. SIESTA_worker ) return

     call end_of_cycle_save_operations(SCFconverged)
+=======
+    if ( SIESTA_worker ) then
+!===
+    call end_of_cycle_save_operations()
+>>>>>>> MERGE-SOURCE

     if ( .not. SCFconverged ) then
       if ( SCFMustConverge ) then
@@ -538,11 +547,18 @@
     ! Clean-up here to limit memory usage
     call mixers_scf_history_init( )

+!===
+    endif ! Siesta_Worker
+
     ! End of standard SCF loop.
     ! Do one more pass to compute forces and stresses

     ! Note that this call will no longer overwrite H while computing the
     ! final energies, forces and stresses...
+
+    ! If we want to preserve the "Siesta_Worker" subset implementation for ELSI,
+    ! this block needs to be executed by everybody
+    ! as it contains a call to the ELSI interface to get the EDM

     if ( fdf_get("compute-forces",.true.) ) then
       call post_scf_work( istep, iscf , SCFconverged )
@@ -550,6 +566,15 @@
     if (ionode) call memory_snapshot("after post_scf_work")
 #endif
     end if
+
+#ifdef SIESTA__ELSI
+    ! Recall that there could be an extra call after the scf loop to get the EDM
+    if ( isolve == SOLVE_ELSI ) then
+      call elsi_finalize_scfloop()
+    end if
+#endif
+
+    if (.not. Siesta_Worker) RETURN

     ! ... so H at this point is the latest generator of the DM, except
     ! if mixing H beyond self-consistency or terminating the scf loop

=== modified file 'Src/siesta_init.F'
--- Src/siesta_init.F 2018-11-27 14:28:55 +0000
+++ Src/siesta_init.F 2019-01-16 11:00:58 +0000
@@ -267,8 +267,11 @@
       call reinit( sname )

 #ifdef MPI
+
+#ifdef SIESTA__PEXSI
 ! Optionally, restrict the number of processors that SIESTA uses, out
-! of the total pool
+! of the total pool (but only for PEXSI, not yet for other solvers)
+
       npSIESTA = fdf_get("MPI.Nprocs.SIESTA",Nodes)
       call MPI_Comm_Group(MPI_Comm_World, World_Group, MPIerror)
       call MPI_Group_incl(World_Group, npSIESTA,
@@ -278,8 +281,16 @@
      $                    SIESTA_Comm, MPIerror)

       SIESTA_worker = (Node < npSIESTA)
-
-! Swap communicator
+#else
+      SIESTA_Comm = MPI_Comm_World
+      SIESTA_worker = .true.
+#endif
+! Swap communicators
+! This is needed in case that the whole program is run with
+! a subset of the global processes. In that case, the DFT instance
+! communicator (a better name than 'true world' communicator)
+! is at this point MPI_COMM_WORLD
+      MPI_COMM_DFT = MPI_COMM_WORLD
 ! This is needed since MPI_COMM_WORLD is used implicitly by
 ! Siesta for all operations.
       MPI_COMM_WORLD = SIESTA_Comm
@@ -287,7 +298,7 @@
       call MPI_Comm_Rank( MPI_Comm_World, Node, MPIerror )
       call MPI_Comm_Size( MPI_Comm_World, Nodes, MPIerror )
       endif
-#else
+#else /* not MPI */
       SIESTA_worker = .true.
       Node = 0
       Nodes = 1
@@ -586,7 +597,14 @@
 ! auxiliary supercell for the calculation of matrix elements.
       call setup_kpoint_scf( ucell )
       not_using_auxcell = gamma_scf
-
+#ifdef SIESTA__ELSI
+      if (isolve == SOLVE_ELSI) then
+         if (mod(Nodes,kpoint_scf%N*spin%spinor) /=0) then
+            call die("The number of MPI processes " //
+     $           "must be a multiple of nk*nspin")
+         endif
+      endif
+#endif
 ! Call initialisation of PDOS here since we need to check if
 ! the auxiliary supercell is needed for a non-gamma calculation
       call init_projected_DOS( ucell )

=== modified file 'Src/siesta_options.F90'
--- Src/siesta_options.F90 2018-09-12 13:49:21 +0000
+++ Src/siesta_options.F90 2019-01-16 11:00:58 +0000
@@ -246,6 +246,7 @@
       integer, parameter :: SOLVE_PEXSI = 4
       integer, parameter :: MATRIX_WRITE = 5
       integer, parameter :: SOLVE_CHESS = 6
+      integer, parameter :: SOLVE_ELSI = 7
 
 #ifdef SIESTA__FLOOK
       ! LUA-handle
 
=== modified file 'Src/siesta_tddft.F90'
--- Src/siesta_tddft.F90 2018-06-26 17:09:38 +0000
+++ Src/siesta_tddft.F90 2019-01-16 11:00:58 +0000
@@ -61,7 +61,7 @@
 #ifdef SIESTA__PEXSI
       ! Broadcast relevant things for program logic
       ! These were set in read_options, called only by "SIESTA_workers".
-      call broadcast(nscf,comm=true_MPI_Comm_World)
+      call broadcast(nscf,comm=MPI_Comm_DFT)
 #endif
 
       if ( SIESTA_worker ) then
 
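
The broadcast change matters for ranks outside the worker subset: they never
execute read_options, yet still need nscf for the program logic, so the value
must travel over the full DFT-instance communicator rather than the (already
swapped) MPI_Comm_World. With SIESTA's broadcast wrapper elided, the plain-MPI
equivalent would be (assuming rank 0 holds the value):

      call MPI_Bcast(nscf, 1, MPI_Integer, 0, MPI_Comm_DFT, ierr)
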
=== modified file 'Src/state_init.F'
--- Src/state_init.F 2018-10-23 09:46:30 +0000
+++ Src/state_init.F 2019-01-16 11:00:58 +0000
@@ -811,6 +811,10 @@
 #endif
       call timer( 'state_init', 2 )
 
+! Call BLACS setup here if needed (all solvers except PEXSI)
+! Setup S in 2d Scalapack distribution if needed (ELSI ELPA solver)
+! Initialize ELSI
+
       END subroutine state_init
 
       subroutine check_cohp()
 
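
The three placeholder comments outline where the solver setup would live. For
orientation, a minimal ELSI initialization along those lines would use the
library's public Fortran API (the routine names are real ELSI entry points;
the numeric choices below are illustrative, not SIESTA's actual settings):

      subroutine elsi_setup_sketch(n_basis, n_electron, n_state, comm, ctxt, blk)
        use elsi, only: elsi_handle, elsi_init, elsi_set_mpi, elsi_set_blacs
        implicit none
        integer, intent(in) :: n_basis, n_state, comm, ctxt, blk
        real(8), intent(in) :: n_electron
        type(elsi_handle)   :: eh
        ! 1 = ELPA solver, 1 = MULTI_PROC parallelism, 0 = BLACS_DENSE storage
        call elsi_init(eh, 1, 1, 0, n_basis, n_electron, n_state)
        call elsi_set_mpi(eh, comm)         ! communicator for this instance
        call elsi_set_blacs(eh, ctxt, blk)  ! 2D block-cyclic layout of H/S
      end subroutine elsi_setup_sketch
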
=== modified file 'Src/write_subs.F'
--- Src/write_subs.F 2018-06-04 14:08:20 +0000
+++ Src/write_subs.F 2019-01-16 11:00:58 +0000
@@ -441,6 +441,7 @@
       write(6,"(/a,f14.6,/)") 'siesta: Eharris(eV) = ', Eharrs/eV
 ! No need for further cml output
       elseif ((isolve==SOLVE_DIAGON) .or. (isolve==SOLVE_PEXSI)
+     .        .or. (isolve==SOLVE_ELSI)
      .        .or. (isolve==SOLVE_CHESS)
      .        .or. (isolve==SOLVE_TRANSI)) then
        if (cml_p) then
 
=== added directory 'Tests/si-quantum-dot'
=== added file 'Tests/si-quantum-dot/coords.Si987H372.fdf'
--- Tests/si-quantum-dot/coords.Si987H372.fdf 1970-01-01 00:00:00 +0000
+++ Tests/si-quantum-dot/coords.Si987H372.fdf 2019-01-16 11:00:58 +0000
@@ -0,0 +1,1368 @@
1number-of-atoms 1359
2number-of-species 2
3#
4# Si987H372 from CSIRO Silicon Quantum Dot dataset
5# https://doi.org/10.4225/08/5721BB609EDB0
6#
7atomic-coordinates-format Ang
8%block atomic-coordinates-and-atomic-species
9-12.27486761 -6.80192861 -4.06859976 1 # Si
10-12.25680064 -9.54338724 -1.35303719 1 # Si
11-15.00302593 -6.81988895 -1.33325565 1 # Si
12-13.61801556 -8.18242131 -0.02610591 1 # Si
13-12.27960089 -6.82966552 1.35920843 1 # Si
14-12.28371979 -9.54137501 4.12509126 1 # Si
15-13.62454044 -8.34100436 5.61139268 1 # Si
16-12.30649445 -6.84800235 6.84800235 1 # Si
17-12.25680064 -1.35303719 -9.54338724 1 # Si
18-12.27486761 -4.06859976 -6.80192861 1 # Si
19-15.00302593 -1.33325565 -6.81988895 1 # Si
20-13.66848041 -2.72110232 -5.45900176 1 # Si
21-13.66848041 -5.45900176 -2.72110232 1 # Si
22-16.37023166 -2.73136988 -2.73136988 1 # Si
23-15.01032944 -4.07390228 -4.07390228 1 # Si
24-12.30121531 -1.35797308 -4.09496434 1 # Si
25-12.30121531 -4.09496434 -1.35797308 1 # Si
26-15.04397328 -1.36373167 -1.36373167 1 # Si
27-16.34134070 -5.63195305 0.16771494 1 # Si
28-13.68153993 -2.73058729 0.00300340 1 # Si
29-13.61156377 -5.42510828 2.69385812 1 # Si
30-16.34634363 -2.90187977 2.90187977 1 # Si
31-15.03413728 -4.10489826 1.37590571 1 # Si
32-12.30671934 -1.37011664 1.37011664 1 # Si
33-12.27225867 -4.09402380 4.09402380 1 # Si
34-15.03413728 -1.37590571 4.10489826 1 # Si
35-13.61156377 -2.69385812 5.42510828 1 # Si
36-13.62454044 -5.61139268 8.34100436 1 # Si
37-12.27960089 -1.35920843 6.82966552 1 # Si
38-12.28371979 -4.12509126 9.54137501 1 # Si
39-13.61801556 -0.02610591 -8.18242131 1 # Si
40-12.28371979 4.12509126 -9.54137501 1 # Si
41-12.27960089 1.35920843 -6.82966552 1 # Si
42-16.34134070 0.16771494 -5.63195305 1 # Si
43-13.61156377 2.69385812 -5.42510828 1 # Si
44-13.68153993 0.00300340 -2.73058729 1 # Si
45-16.34634363 2.90187977 -2.90187977 1 # Si
46-15.03413728 1.37590571 -4.10489826 1 # Si
47-12.27225867 4.09402380 -4.09402380 1 # Si
48-12.30671934 1.37011664 -1.37011664 1 # Si
49-15.03413728 4.10489826 -1.37590571 1 # Si
50-16.37821316 0.00000000 0.00000000 1 # Si
51-13.68153993 2.73058729 -0.00300340 1 # Si
52-13.68153993 -0.00300340 2.73058729 1 # Si
53-16.37023166 2.73136988 2.73136988 1 # Si
54-15.04397328 1.36373167 1.36373167 1 # Si
55-12.30121531 4.09496434 1.35797308 1 # Si
56-12.30121531 1.35797308 4.09496434 1 # Si
57-15.01032944 4.07390228 4.07390228 1 # Si
58-16.34134070 -0.16771494 5.63195305 1 # Si
59-13.66848041 2.72110232 5.45900176 1 # Si
60-13.61801556 0.02610591 8.18242131 1 # Si
61-15.00302593 1.33325565 6.81988895 1 # Si
62-12.27486761 4.06859976 6.80192861 1 # Si
63-12.25680064 1.35303719 9.54338724 1 # Si
64-13.62454044 5.61139268 -8.34100436 1 # Si
65-12.30649445 6.84800235 -6.84800235 1 # Si
66-13.62454044 8.34100436 -5.61139268 1 # Si
67-13.61156377 5.42510828 -2.69385812 1 # Si
68-12.28371979 9.54137501 -4.12509126 1 # Si
69-12.27960089 6.82966552 -1.35920843 1 # Si
70-16.34134070 5.63195305 -0.16771494 1 # Si
71-13.61801556 8.18242131 0.02610591 1 # Si
72-13.66848041 5.45900176 2.72110232 1 # Si
73-15.00302593 6.81988895 1.33325565 1 # Si
74-12.25680064 9.54338724 1.35303719 1 # Si
75-12.27486761 6.80192861 4.06859976 1 # Si
76-6.80192861 -12.27486761 -4.06859976 1 # Si
77-6.81988895 -15.00302593 -1.33325565 1 # Si
78-9.54338724 -12.25680064 -1.35303719 1 # Si
79-8.18242131 -13.61801556 -0.02610591 1 # Si
80-6.82966552 -12.27960089 1.35920843 1 # Si
81-9.54137501 -12.28371979 4.12509126 1 # Si
82-8.34100436 -13.62454044 5.61139268 1 # Si
83-6.84800235 -12.30649445 6.84800235 1 # Si
84-6.80203106 -6.80203106 -9.53854106 1 # Si
85-6.80203106 -9.53854106 -6.80203106 1 # Si
86-9.53854106 -6.80203106 -6.80203106 1 # Si
87-8.18684496 -8.18684496 -5.46061302 1 # Si
88-8.18958251 -10.91859845 -2.73303284 1 # Si
89-10.91859845 -8.18958251 -2.73303284 1 # Si
90-9.53373688 -9.53373688 -4.08178398 1 # Si
91-6.82463094 -6.82463094 -4.09496380 1 # Si
92-6.82852418 -9.55456223 -1.36492084 1 # Si
93-9.55456223 -6.82852418 -1.36492084 1 # Si
94-10.90149235 -10.90149235 -0.01304838 1 # Si
95-8.19611273 -8.19611273 0.01202455 1 # Si
96-8.19260795 -10.93410102 2.74437054 1 # Si
97-10.93410102 -8.19260795 2.74437054 1 # Si
98-9.56265344 -9.56265344 1.38318968 1 # Si
99-6.82894278 -6.82894278 1.37205363 1 # Si
100-6.82705083 -9.56489675 4.10298675 1 # Si
101-9.56489675 -6.82705083 4.10298675 1 # Si
102-10.90291341 -10.90291341 5.43284542 1 # Si
103-8.19345007 -8.19345007 5.46312108 1 # Si
104-8.17530342 -10.89255734 8.17530342 1 # Si
105-10.89255734 -8.17530342 8.17530342 1 # Si
106-9.55790920 -9.55790920 6.82271744 1 # Si
107-6.82700831 -6.82700831 6.82700831 1 # Si
108-6.82271744 -9.55790920 9.55790920 1 # Si
109-9.55790920 -6.82271744 9.55790920 1 # Si
110-8.17530342 -8.17530342 10.89255734 1 # Si
111-6.84800235 -6.84800235 12.30649445 1 # Si
112-6.81988895 -1.33325565 -15.00302593 1 # Si
113-6.80192861 -4.06859976 -12.27486761 1 # Si
114-9.54338724 -1.35303719 -12.25680064 1 # Si
115-8.18958251 -2.73303284 -10.91859845 1 # Si
116-8.18684496 -5.46061302 -8.18684496 1 # Si
117-10.91859845 -2.73303284 -8.18958251 1 # Si
118-9.53373688 -4.08178398 -9.53373688 1 # Si
119-6.82852418 -1.36492084 -9.55456223 1 # Si
120-6.82463094 -4.09496380 -6.82463094 1 # Si
121-9.55456223 -1.36492084 -6.82852418 1 # Si
122-10.92814020 -5.45537137 -5.45537137 1 # Si
123-8.19145655 -2.72890018 -5.45886981 1 # Si
124-8.19145655 -5.45886981 -2.72890018 1 # Si
125-10.93548204 -2.72665137 -2.72665137 1 # Si
126-9.56309566 -4.09195916 -4.09195916 1 # Si
127-6.82665093 -1.36387577 -4.09489509 1 # Si
128-6.82665093 -4.09489509 -1.36387577 1 # Si
129-9.56805296 -1.36256401 -1.36256401 1 # Si
130-10.91922202 -5.46070202 -0.00117430 1 # Si
131-8.19459074 -2.72767707 -0.00218763 1 # Si
132-8.19802839 -5.46221414 2.73481248 1 # Si
133-10.91829649 -2.72813440 2.72813440 1 # Si
134-9.55550648 -4.09408870 1.36300571 1 # Si
135-6.82823155 -1.36467215 1.36467215 1 # Si
136-6.82965269 -4.09758805 4.09758805 1 # Si
137-9.55550648 -1.36300571 4.09408870 1 # Si
138-10.94234231 -5.47569502 5.47569502 1 # Si
139-8.19802839 -2.73481248 5.46221414 1 # Si
140-8.19345007 -5.46312108 8.19345007 1 # Si
141-10.93410102 -2.74437054 8.19260795 1 # Si
142-9.56489675 -4.10298675 6.82705083 1 # Si
143-6.82894278 -1.37205363 6.82894278 1 # Si
144-6.82705083 -4.10298675 9.56489675 1 # Si
145-9.56265344 -1.38318968 9.56265344 1 # Si
146-10.90291341 -5.43284542 10.90291341 1 # Si
147-8.19260795 -2.74437054 10.93410102 1 # Si
148-8.34100436 -5.61139268 13.62454044 1 # Si
149-9.54137501 -4.12509126 12.28371979 1 # Si
150-6.82966552 -1.35920843 12.27960089 1 # Si
151-8.18242131 -0.02610591 -13.61801556 1 # Si
152-6.82966552 1.35920843 -12.27960089 1 # Si
153-9.54137501 4.12509126 -12.28371979 1 # Si
154-10.90149235 -0.01304838 -10.90149235 1 # Si
155-8.19260795 2.74437054 -10.93410102 1 # Si
156-8.19611273 0.01202455 -8.19611273 1 # Si
157-10.93410102 2.74437054 -8.19260795 1 # Si
158-9.56265344 1.38318968 -9.56265344 1 # Si
159-6.82705083 4.10298675 -9.56489675 1 # Si
160-6.82894278 1.37205363 -6.82894278 1 # Si
161-9.56489675 4.10298675 -6.82705083 1 # Si
162-10.91922202 -0.00117430 -5.46070202 1 # Si
163-8.19802839 2.73481248 -5.46221414 1 # Si
164-8.19459074 -0.00218763 -2.72767707 1 # Si
165-10.91829649 2.72813440 -2.72813440 1 # Si
166-9.55550648 1.36300571 -4.09408870 1 # Si
167-6.82965269 4.09758805 -4.09758805 1 # Si
168-6.82823155 1.36467215 -1.36467215 1 # Si
169-9.55550648 4.09408870 -1.36300571 1 # Si
170-10.94349563 0.00000000 -0.00000000 1 # Si
171-8.19459074 2.72767707 0.00218763 1 # Si
172-8.19459074 0.00218763 2.72767707 1 # Si
173-10.93548204 2.72665137 2.72665137 1 # Si
174-9.56805296 1.36256401 1.36256401 1 # Si
175-6.82665093 4.09489509 1.36387577 1 # Si
176-6.82665093 1.36387577 4.09489509 1 # Si
177-9.56309566 4.09195916 4.09195916 1 # Si
178-10.91922202 0.00117430 5.46070202 1 # Si
179-8.19145655 2.72890018 5.45886981 1 # Si
180-8.19611273 -0.01202455 8.19611273 1 # Si
181-10.91859845 2.73303284 8.18958251 1 # Si
182-9.55456223 1.36492084 6.82852418 1 # Si
183-6.82463094 4.09496380 6.82463094 1 # Si
184-6.82852418 1.36492084 9.55456223 1 # Si
185-9.53373688 4.08178398 9.53373688 1 # Si
186-10.90149235 0.01304838 10.90149235 1 # Si
187-8.18958251 2.73303284 10.91859845 1 # Si
188-8.18242131 0.02610591 13.61801556 1 # Si
189-9.54338724 1.35303719 12.25680064 1 # Si
190-6.80192861 4.06859976 12.27486761 1 # Si
191-6.81988895 1.33325565 15.00302593 1 # Si
192-8.34100436 5.61139268 -13.62454044 1 # Si
193-6.84800235 6.84800235 -12.30649445 1 # Si
194-10.90291341 5.43284542 -10.90291341 1 # Si
195-8.17530342 8.17530342 -10.89255734 1 # Si
196-8.19345007 5.46312108 -8.19345007 1 # Si
197-10.89255734 8.17530342 -8.17530342 1 # Si
198-9.55790920 6.82271744 -9.55790920 1 # Si
199-6.82271744 9.55790920 -9.55790920 1 # Si
200-6.82700831 6.82700831 -6.82700831 1 # Si
201-9.55790920 9.55790920 -6.82271744 1 # Si
202-10.94234231 5.47569502 -5.47569502 1 # Si
203-8.19345007 8.19345007 -5.46312108 1 # Si
204-8.19802839 5.46221414 -2.73481248 1 # Si
205-10.93410102 8.19260795 -2.74437054 1 # Si
206-9.56489675 6.82705083 -4.10298675 1 # Si
207-6.82705083 9.56489675 -4.10298675 1 # Si
208-6.82894278 6.82894278 -1.37205363 1 # Si
209-9.56265344 9.56265344 -1.38318968 1 # Si
210-10.91922202 5.46070202 0.00117430 1 # Si
211-8.19611273 8.19611273 -0.01202455 1 # Si
212-8.19145655 5.45886981 2.72890018 1 # Si
213-10.91859845 8.18958251 2.73303284 1 # Si
214-9.55456223 6.82852418 1.36492084 1 # Si
215-6.82852418 9.55456223 1.36492084 1 # Si
216-6.82463094 6.82463094 4.09496380 1 # Si
217-9.53373688 9.53373688 4.08178398 1 # Si
218-10.92814020 5.45537137 5.45537137 1 # Si
219-8.18684496 8.18684496 5.46061302 1 # Si
220-8.18684496 5.46061302 8.18684496 1 # Si
221-9.53854106 6.80203106 6.80203106 1 # Si
222-6.80203106 9.53854106 6.80203106 1 # Si
223-6.80203106 6.80203106 9.53854106 1 # Si
224-8.17530342 10.89255734 -8.17530342 1 # Si
225-6.84800235 12.30649445 -6.84800235 1 # Si
226-10.90291341 10.90291341 -5.43284542 1 # Si
227-8.34100436 13.62454044 -5.61139268 1 # Si
228-8.19260795 10.93410102 -2.74437054 1 # Si
229-9.54137501 12.28371979 -4.12509126 1 # Si
230-6.82966552 12.27960089 -1.35920843 1 # Si
231-10.90149235 10.90149235 0.01304838 1 # Si
232-8.18242131 13.61801556 0.02610591 1 # Si
233-8.18958251 10.91859845 2.73303284 1 # Si
234-9.54338724 12.25680064 1.35303719 1 # Si
235-6.81988895 15.00302593 1.33325565 1 # Si
236-6.80192861 12.27486761 4.06859976 1 # Si
237-1.35303719 -12.25680064 -9.54338724 1 # Si
238-1.33325565 -15.00302593 -6.81988895 1 # Si
239-4.06859976 -12.27486761 -6.80192861 1 # Si
240-2.72110232 -13.66848041 -5.45900176 1 # Si
241-2.73136988 -16.37023166 -2.73136988 1 # Si
242-5.45900176 -13.66848041 -2.72110232 1 # Si
243-4.07390228 -15.01032944 -4.07390228 1 # Si
244-1.35797308 -12.30121531 -4.09496434 1 # Si
245-1.36373167 -15.04397328 -1.36373167 1 # Si
246-4.09496434 -12.30121531 -1.35797308 1 # Si
247-5.63195305 -16.34134070 0.16771494 1 # Si
248-2.73058729 -13.68153993 0.00300340 1 # Si
249-2.90187977 -16.34634363 2.90187977 1 # Si
250-5.42510828 -13.61156377 2.69385812 1 # Si
251-4.10489826 -15.03413728 1.37590571 1 # Si
252-1.37011664 -12.30671934 1.37011664 1 # Si
253-1.37590571 -15.03413728 4.10489826 1 # Si
254-4.09402380 -12.27225867 4.09402380 1 # Si
255-2.69385812 -13.61156377 5.42510828 1 # Si
256-5.61139268 -13.62454044 8.34100436 1 # Si
257-1.35920843 -12.27960089 6.82966552 1 # Si
258-4.12509126 -12.28371979 9.54137501 1 # Si
259-1.33325565 -6.81988895 -15.00302593 1 # Si
260-1.35303719 -9.54338724 -12.25680064 1 # Si
261-4.06859976 -6.80192861 -12.27486761 1 # Si
262-2.73303284 -8.18958251 -10.91859845 1 # Si
263-2.73303284 -10.91859845 -8.18958251 1 # Si
264-5.46061302 -8.18684496 -8.18684496 1 # Si
265-4.08178398 -9.53373688 -9.53373688 1 # Si
266-1.36492084 -6.82852418 -9.55456223 1 # Si
267-1.36492084 -9.55456223 -6.82852418 1 # Si
268-4.09496380 -6.82463094 -6.82463094 1 # Si
269-5.45537137 -10.92814020 -5.45537137 1 # Si
270-2.72890018 -8.19145655 -5.45886981 1 # Si
271-2.72665137 -10.93548204 -2.72665137 1 # Si
272-5.45886981 -8.19145655 -2.72890018 1 # Si
273-4.09195916 -9.56309566 -4.09195916 1 # Si
274-1.36387577 -6.82665093 -4.09489509 1 # Si
275-1.36256401 -9.56805296 -1.36256401 1 # Si
276-4.09489509 -6.82665093 -1.36387577 1 # Si
277-5.46070202 -10.91922202 -0.00117430 1 # Si
278-2.72767707 -8.19459074 -0.00218763 1 # Si
279-2.72813440 -10.91829649 2.72813440 1 # Si
280-5.46221414 -8.19802839 2.73481248 1 # Si
281-4.09408870 -9.55550648 1.36300571 1 # Si
282-1.36467215 -6.82823155 1.36467215 1 # Si
283-1.36300571 -9.55550648 4.09408870 1 # Si
284-4.09758805 -6.82965269 4.09758805 1 # Si
285-5.47569502 -10.94234231 5.47569502 1 # Si
286-2.73481248 -8.19802839 5.46221414 1 # Si
287-2.74437054 -10.93410102 8.19260795 1 # Si
288-5.46312108 -8.19345007 8.19345007 1 # Si
289-4.10298675 -9.56489675 6.82705083 1 # Si
290-1.37205363 -6.82894278 6.82894278 1 # Si
291-1.38318968 -9.56265344 9.56265344 1 # Si
292-4.10298675 -6.82705083 9.56489675 1 # Si
293-5.43284542 -10.90291341 10.90291341 1 # Si
294-2.74437054 -8.19260795 10.93410102 1 # Si
295-5.61139268 -8.34100436 13.62454044 1 # Si
296-4.12509126 -9.54137501 12.28371979 1 # Si
297-1.35920843 -6.82966552 12.27960089 1 # Si
298-2.73136988 -2.73136988 -16.37023166 1 # Si
299-2.72110232 -5.45900176 -13.66848041 1 # Si
300-5.45900176 -2.72110232 -13.66848041 1 # Si
301-4.07390228 -4.07390228 -15.01032944 1 # Si
302-1.36373167 -1.36373167 -15.04397328 1 # Si
303-1.35797308 -4.09496434 -12.30121531 1 # Si
304-4.09496434 -1.35797308 -12.30121531 1 # Si
305-5.45537137 -5.45537137 -10.92814020 1 # Si
306-2.72665137 -2.72665137 -10.93548204 1 # Si
307-2.72890018 -5.45886981 -8.19145655 1 # Si
308-5.45886981 -2.72890018 -8.19145655 1 # Si
309-4.09195916 -4.09195916 -9.56309566 1 # Si
310-1.36256401 -1.36256401 -9.56805296 1 # Si
311-1.36387577 -4.09489509 -6.82665093 1 # Si
312-4.09489509 -1.36387577 -6.82665093 1 # Si
313-5.45846237 -5.45846237 -5.45846237 1 # Si
314-2.72992539 -2.72992539 -5.46104379 1 # Si
315-2.72992539 -5.46104379 -2.72992539 1 # Si
316-5.46104379 -2.72992539 -2.72992539 1 # Si
317-4.09449230 -4.09449230 -4.09449230 1 # Si
318-1.36499559 -1.36499559 -4.09571959 1 # Si
319-1.36499559 -4.09571959 -1.36499559 1 # Si
320-4.09571959 -1.36499559 -1.36499559 1 # Si
321-5.46373779 -5.46373779 0.00319897 1 # Si
322-2.73149619 -2.73149619 0.00070150 1 # Si
323-2.73166810 -5.46412728 2.73166810 1 # Si
324-5.46412728 -2.73166810 2.73166810 1 # Si
325-4.09725659 -4.09725659 1.36658639 1 # Si
326-1.36585704 -1.36585704 1.36585704 1 # Si
327-1.36658639 -4.09725659 4.09725659 1 # Si
328-4.09725659 -1.36658639 4.09725659 1 # Si
329-5.46236779 -5.46236779 5.46236779 1 # Si
330-2.73166810 -2.73166810 5.46412728 1 # Si
331-2.73481248 -5.46221414 8.19802839 1 # Si
332-5.46221414 -2.73481248 8.19802839 1 # Si
333-4.09758805 -4.09758805 6.82965269 1 # Si
334-1.36467215 -1.36467215 6.82823155 1 # Si
335-1.36300571 -4.09408870 9.55550648 1 # Si
336-4.09408870 -1.36300571 9.55550648 1 # Si
337-5.47569502 -5.47569502 10.94234231 1 # Si
338-2.72813440 -2.72813440 10.91829649 1 # Si
339-2.69385812 -5.42510828 13.61156377 1 # Si
340-5.42510828 -2.69385812 13.61156377 1 # Si
341-4.09402380 -4.09402380 12.27225867 1 # Si
342-1.37011664 -1.37011664 12.30671934 1 # Si
343-1.37590571 -4.10489826 15.03413728 1 # Si
344-4.10489826 -1.37590571 15.03413728 1 # Si
345-2.90187977 -2.90187977 16.34634363 1 # Si
346-5.63195305 0.16771494 -16.34134070 1 # Si
347-2.90187977 2.90187977 -16.34634363 1 # Si
348-2.73058729 0.00300340 -13.68153993 1 # Si
349-5.42510828 2.69385812 -13.61156377 1 # Si
350-4.10489826 1.37590571 -15.03413728 1 # Si
351-1.37590571 4.10489826 -15.03413728 1 # Si
352-1.37011664 1.37011664 -12.30671934 1 # Si
353-4.09402380 4.09402380 -12.27225867 1 # Si
354-5.46070202 -0.00117430 -10.91922202 1 # Si
355-2.72813440 2.72813440 -10.91829649 1 # Si
356-2.72767707 -0.00218763 -8.19459074 1 # Si
357-5.46221414 2.73481248 -8.19802839 1 # Si
358-4.09408870 1.36300571 -9.55550648 1 # Si
359-1.36300571 4.09408870 -9.55550648 1 # Si
360-1.36467215 1.36467215 -6.82823155 1 # Si
361-4.09758805 4.09758805 -6.82965269 1 # Si
362-5.46373779 0.00319897 -5.46373779 1 # Si
363-2.73166810 2.73166810 -5.46412728 1 # Si
364-2.73149619 0.00070150 -2.73149619 1 # Si
365-5.46412728 2.73166810 -2.73166810 1 # Si
366-4.09725659 1.36658639 -4.09725659 1 # Si
367-1.36658639 4.09725659 -4.09725659 1 # Si
368-1.36585704 1.36585704 -1.36585704 1 # Si
369-4.09725659 4.09725659 -1.36658639 1 # Si
370-5.46151579 -0.00000000 0.00000000 1 # Si
371-2.73149619 2.73149619 -0.00070150 1 # Si
372-2.73149619 -0.00070150 2.73149619 1 # Si
373-5.46104379 2.72992539 2.72992539 1 # Si
374-4.09571959 1.36499559 1.36499559 1 # Si
375-1.36499559 4.09571959 1.36499559 1 # Si
376-1.36499559 1.36499559 4.09571959 1 # Si
377-4.09449230 4.09449230 4.09449230 1 # Si
378-5.46373779 -0.00319897 5.46373779 1 # Si
379-2.72992539 2.72992539 5.46104379 1 # Si
380-2.72767707 0.00218763 8.19459074 1 # Si
381-5.45886981 2.72890018 8.19145655 1 # Si
382-4.09489509 1.36387577 6.82665093 1 # Si
383-1.36387577 4.09489509 6.82665093 1 # Si
384-1.36256401 1.36256401 9.56805296 1 # Si
385-4.09195916 4.09195916 9.56309566 1 # Si
386-5.46070202 0.00117430 10.91922202 1 # Si
387-2.72665137 2.72665137 10.93548204 1 # Si
388-2.73058729 -0.00300340 13.68153993 1 # Si
389-5.45900176 2.72110232 13.66848041 1 # Si
390-4.09496434 1.35797308 12.30121531 1 # Si
391-1.35797308 4.09496434 12.30121531 1 # Si
392-1.36373167 1.36373167 15.04397328 1 # Si
393-4.07390228 4.07390228 15.01032944 1 # Si
394-5.63195305 -0.16771494 16.34134070 1 # Si
395-2.73136988 2.73136988 16.37023166 1 # Si
396-2.69385812 5.42510828 -13.61156377 1 # Si
397-5.61139268 8.34100436 -13.62454044 1 # Si
398-1.35920843 6.82966552 -12.27960089 1 # Si
399-4.12509126 9.54137501 -12.28371979 1 # Si
400-5.47569502 5.47569502 -10.94234231 1 # Si
401-2.74437054 8.19260795 -10.93410102 1 # Si
402-2.73481248 5.46221414 -8.19802839 1 # Si
403-5.46312108 8.19345007 -8.19345007 1 # Si
404-4.10298675 6.82705083 -9.56489675 1 # Si
405-1.38318968 9.56265344 -9.56265344 1 # Si
406-1.37205363 6.82894278 -6.82894278 1 # Si
407-4.10298675 9.56489675 -6.82705083 1 # Si
408-5.46236779 5.46236779 -5.46236779 1 # Si
409-2.73481248 8.19802839 -5.46221414 1 # Si
410-2.73166810 5.46412728 -2.73166810 1 # Si
411-5.46221414 8.19802839 -2.73481248 1 # Si
412-4.09758805 6.82965269 -4.09758805 1 # Si
413-1.36300571 9.55550648 -4.09408870 1 # Si
414-1.36467215 6.82823155 -1.36467215 1 # Si
415-4.09408870 9.55550648 -1.36300571 1 # Si
416-5.46373779 5.46373779 -0.00319897 1 # Si
417-2.72767707 8.19459074 0.00218763 1 # Si
418-2.72992539 5.46104379 2.72992539 1 # Si
419-5.45886981 8.19145655 2.72890018 1 # Si
420-4.09489509 6.82665093 1.36387577 1 # Si
421-1.36256401 9.56805296 1.36256401 1 # Si
422-1.36387577 6.82665093 4.09489509 1 # Si
423-4.09195916 9.56309566 4.09195916 1 # Si
424-5.45846237 5.45846237 5.45846237 1 # Si
425-2.72890018 8.19145655 5.45886981 1 # Si
426-2.72890018 5.45886981 8.19145655 1 # Si
427-5.46061302 8.18684496 8.18684496 1 # Si
428-4.09496380 6.82463094 6.82463094 1 # Si
429-1.36492084 9.55456223 6.82852418 1 # Si
430-1.36492084 6.82852418 9.55456223 1 # Si
431-4.08178398 9.53373688 9.53373688 1 # Si
432-5.45537137 5.45537137 10.92814020 1 # Si
433-2.73303284 8.18958251 10.91859845 1 # Si
434-2.72110232 5.45900176 13.66848041 1 # Si
435-4.06859976 6.80192861 12.27486761 1 # Si
436-1.35303719 9.54338724 12.25680064 1 # Si
437-1.33325565 6.81988895 15.00302593 1 # Si
438-5.43284542 10.90291341 -10.90291341 1 # Si
439-2.74437054 10.93410102 -8.19260795 1 # Si
440-5.61139268 13.62454044 -8.34100436 1 # Si
441-4.12509126 12.28371979 -9.54137501 1 # Si
442-1.35920843 12.27960089 -6.82966552 1 # Si
443-5.47569502 10.94234231 -5.47569502 1 # Si
444-2.69385812 13.61156377 -5.42510828 1 # Si
445-2.72813440 10.91829649 -2.72813440 1 # Si
446-5.42510828 13.61156377 -2.69385812 1 # Si
447-4.09402380 12.27225867 -4.09402380 1 # Si
448-1.37590571 15.03413728 -4.10489826 1 # Si
449-1.37011664 12.30671934 -1.37011664 1 # Si
450-4.10489826 15.03413728 -1.37590571 1 # Si
451-5.46070202 10.91922202 0.00117430 1 # Si
452-2.73058729 13.68153993 -0.00300340 1 # Si
453-2.72665137 10.93548204 2.72665137 1 # Si
454-5.45900176 13.66848041 2.72110232 1 # Si
455-4.09496434 12.30121531 1.35797308 1 # Si
456-1.36373167 15.04397328 1.36373167 1 # Si
457-1.35797308 12.30121531 4.09496434 1 # Si
458-4.07390228 15.01032944 4.07390228 1 # Si
459-5.45537137 10.92814020 5.45537137 1 # Si
460-2.72110232 13.66848041 5.45900176 1 # Si
461-2.73303284 10.91859845 8.18958251 1 # Si
462-4.06859976 12.27486761 6.80192861 1 # Si
463-1.33325565 15.00302593 6.81988895 1 # Si
464-1.35303719 12.25680064 9.54338724 1 # Si
465-2.90187977 16.34634363 -2.90187977 1 # Si
466-5.63195305 16.34134070 -0.16771494 1 # Si
467-2.73136988 16.37023166 2.73136988 1 # Si
468-0.02610591 -13.61801556 -8.18242131 1 # Si
4694.12509126 -12.28371979 -9.54137501 1 # Si
4701.35920843 -12.27960089 -6.82966552 1 # Si
4710.16771494 -16.34134070 -5.63195305 1 # Si
4722.69385812 -13.61156377 -5.42510828 1 # Si
4732.90187977 -16.34634363 -2.90187977 1 # Si
4740.00300340 -13.68153993 -2.73058729 1 # Si
4751.37590571 -15.03413728 -4.10489826 1 # Si
4764.09402380 -12.27225867 -4.09402380 1 # Si
4774.10489826 -15.03413728 -1.37590571 1 # Si
4781.37011664 -12.30671934 -1.37011664 1 # Si
479-0.00000000 -16.37821316 -0.00000000 1 # Si
4802.73058729 -13.68153993 -0.00300340 1 # Si
4812.73136988 -16.37023166 2.73136988 1 # Si
482-0.00300340 -13.68153993 2.73058729 1 # Si
4831.36373167 -15.04397328 1.36373167 1 # Si
4844.09496434 -12.30121531 1.35797308 1 # Si
4854.07390228 -15.01032944 4.07390228 1 # Si
4861.35797308 -12.30121531 4.09496434 1 # Si
487-0.16771494 -16.34134070 5.63195305 1 # Si
4882.72110232 -13.66848041 5.45900176 1 # Si
4890.02610591 -13.61801556 8.18242131 1 # Si
4901.33325565 -15.00302593 6.81988895 1 # Si
4914.06859976 -12.27486761 6.80192861 1 # Si
4921.35303719 -12.25680064 9.54338724 1 # Si
493-0.02610591 -8.18242131 -13.61801556 1 # Si
4944.12509126 -9.54137501 -12.28371979 1 # Si
4951.35920843 -6.82966552 -12.27960089 1 # Si
496-0.01304838 -10.90149235 -10.90149235 1 # Si
4972.74437054 -8.19260795 -10.93410102 1 # Si
4982.74437054 -10.93410102 -8.19260795 1 # Si
4990.01202455 -8.19611273 -8.19611273 1 # Si
5001.38318968 -9.56265344 -9.56265344 1 # Si
5014.10298675 -6.82705083 -9.56489675 1 # Si
5024.10298675 -9.56489675 -6.82705083 1 # Si
5031.37205363 -6.82894278 -6.82894278 1 # Si
504-0.00117430 -10.91922202 -5.46070202 1 # Si
5052.73481248 -8.19802839 -5.46221414 1 # Si
5062.72813440 -10.91829649 -2.72813440 1 # Si
507-0.00218763 -8.19459074 -2.72767707 1 # Si
5081.36300571 -9.55550648 -4.09408870 1 # Si
5094.09758805 -6.82965269 -4.09758805 1 # Si
5104.09408870 -9.55550648 -1.36300571 1 # Si
5111.36467215 -6.82823155 -1.36467215 1 # Si
5120.00000000 -10.94349563 0.00000000 1 # Si
5132.72767707 -8.19459074 0.00218763 1 # Si
5142.72665137 -10.93548204 2.72665137 1 # Si
5150.00218763 -8.19459074 2.72767707 1 # Si
5161.36256401 -9.56805296 1.36256401 1 # Si
5174.09489509 -6.82665093 1.36387577 1 # Si
5184.09195916 -9.56309566 4.09195916 1 # Si
5191.36387577 -6.82665093 4.09489509 1 # Si
5200.00117430 -10.91922202 5.46070202 1 # Si
5212.72890018 -8.19145655 5.45886981 1 # Si
5222.73303284 -10.91859845 8.18958251 1 # Si
523-0.01202455 -8.19611273 8.19611273 1 # Si
5241.36492084 -9.55456223 6.82852418 1 # Si
5254.09496380 -6.82463094 6.82463094 1 # Si
5264.08178398 -9.53373688 9.53373688 1 # Si
5271.36492084 -6.82852418 9.55456223 1 # Si
5280.01304838 -10.90149235 10.90149235 1 # Si
5292.73303284 -8.18958251 10.91859845 1 # Si
5300.02610591 -8.18242131 13.61801556 1 # Si
5311.35303719 -9.54338724 12.25680064 1 # Si
5324.06859976 -6.80192861 12.27486761 1 # Si
5331.33325565 -6.81988895 15.00302593 1 # Si
5340.16771494 -5.63195305 -16.34134070 1 # Si
5352.90187977 -2.90187977 -16.34634363 1 # Si
5362.69385812 -5.42510828 -13.61156377 1 # Si
5370.00300340 -2.73058729 -13.68153993 1 # Si
5381.37590571 -4.10489826 -15.03413728 1 # Si
5394.10489826 -1.37590571 -15.03413728 1 # Si
5404.09402380 -4.09402380 -12.27225867 1 # Si
5411.37011664 -1.37011664 -12.30671934 1 # Si
542-0.00117430 -5.46070202 -10.91922202 1 # Si
5432.72813440 -2.72813440 -10.91829649 1 # Si
5442.73481248 -5.46221414 -8.19802839 1 # Si
545-0.00218763 -2.72767707 -8.19459074 1 # Si
5461.36300571 -4.09408870 -9.55550648 1 # Si
5474.09408870 -1.36300571 -9.55550648 1 # Si
5484.09758805 -4.09758805 -6.82965269 1 # Si
5491.36467215 -1.36467215 -6.82823155 1 # Si
5500.00319897 -5.46373779 -5.46373779 1 # Si
5512.73166810 -2.73166810 -5.46412728 1 # Si
5522.73166810 -5.46412728 -2.73166810 1 # Si
5530.00070150 -2.73149619 -2.73149619 1 # Si
5541.36658639 -4.09725659 -4.09725659 1 # Si
5554.09725659 -1.36658639 -4.09725659 1 # Si
5564.09725659 -4.09725659 -1.36658639 1 # Si
5571.36585704 -1.36585704 -1.36585704 1 # Si
558-0.00000000 -5.46151579 0.00000000 1 # Si
5592.73149619 -2.73149619 -0.00070150 1 # Si
5602.72992539 -5.46104379 2.72992539 1 # Si
561-0.00070150 -2.73149619 2.73149619 1 # Si
5621.36499559 -4.09571959 1.36499559 1 # Si
5634.09571959 -1.36499559 1.36499559 1 # Si
5644.09449230 -4.09449230 4.09449230 1 # Si
5651.36499559 -1.36499559 4.09571959 1 # Si
566-0.00319897 -5.46373779 5.46373779 1 # Si
5672.72992539 -2.72992539 5.46104379 1 # Si
5682.72890018 -5.45886981 8.19145655 1 # Si
5690.00218763 -2.72767707 8.19459074 1 # Si
5701.36387577 -4.09489509 6.82665093 1 # Si
5714.09489509 -1.36387577 6.82665093 1 # Si
5724.09195916 -4.09195916 9.56309566 1 # Si
5731.36256401 -1.36256401 9.56805296 1 # Si
5740.00117430 -5.46070202 10.91922202 1 # Si
5752.72665137 -2.72665137 10.93548204 1 # Si
5762.72110232 -5.45900176 13.66848041 1 # Si
577-0.00300340 -2.73058729 13.68153993 1 # Si
5781.35797308 -4.09496434 12.30121531 1 # Si
5794.09496434 -1.35797308 12.30121531 1 # Si
5804.07390228 -4.07390228 15.01032944 1 # Si
5811.36373167 -1.36373167 15.04397328 1 # Si
582-0.16771494 -5.63195305 16.34134070 1 # Si
5832.73136988 -2.73136988 16.37023166 1 # Si
584-0.00000000 -0.00000000 -16.37821316 1 # Si
5852.73136988 2.73136988 -16.37023166 1 # Si
5862.73058729 -0.00300340 -13.68153993 1 # Si
587-0.00300340 2.73058729 -13.68153993 1 # Si
5881.36373167 1.36373167 -15.04397328 1 # Si
5894.07390228 4.07390228 -15.01032944 1 # Si
5904.09496434 1.35797308 -12.30121531 1 # Si
5911.35797308 4.09496434 -12.30121531 1 # Si
5920.00000000 0.00000000 -10.94349563 1 # Si
5932.72665137 2.72665137 -10.93548204 1 # Si
5942.72767707 0.00218763 -8.19459074 1 # Si
5950.00218763 2.72767707 -8.19459074 1 # Si
5961.36256401 1.36256401 -9.56805296 1 # Si
5974.09195916 4.09195916 -9.56309566 1 # Si
5984.09489509 1.36387577 -6.82665093 1 # Si
5991.36387577 4.09489509 -6.82665093 1 # Si
600-0.00000000 0.00000000 -5.46151579 1 # Si
6012.72992539 2.72992539 -5.46104379 1 # Si
6022.73149619 -0.00070150 -2.73149619 1 # Si
603-0.00070150 2.73149619 -2.73149619 1 # Si
6041.36499559 1.36499559 -4.09571959 1 # Si
6054.09449230 4.09449230 -4.09449230 1 # Si
6064.09571959 1.36499559 -1.36499559 1 # Si
6071.36499559 4.09571959 -1.36499559 1 # Si
608-0.00000000 0.00000000 -0.00000000 1 # Si
6092.73149619 2.73149619 0.00070150 1 # Si
6102.73149619 0.00070150 2.73149619 1 # Si
6110.00070150 2.73149619 2.73149619 1 # Si
6121.36585704 1.36585704 1.36585704 1 # Si
6134.09725659 4.09725659 1.36658639 1 # Si
6144.09725659 1.36658639 4.09725659 1 # Si
6151.36658639 4.09725659 4.09725659 1 # Si
616-0.00000000 -0.00000000 5.46151579 1 # Si
6172.73166810 2.73166810 5.46412728 1 # Si
6182.72767707 -0.00218763 8.19459074 1 # Si
619-0.00218763 2.72767707 8.19459074 1 # Si
6201.36467215 1.36467215 6.82823155 1 # Si
6214.09758805 4.09758805 6.82965269 1 # Si
6224.09408870 1.36300571 9.55550648 1 # Si
6231.36300571 4.09408870 9.55550648 1 # Si
624-0.00000000 -0.00000000 10.94349563 1 # Si
6252.72813440 2.72813440 10.91829649 1 # Si
6262.73058729 0.00300340 13.68153993 1 # Si
6270.00300340 2.73058729 13.68153993 1 # Si
6281.37011664 1.37011664 12.30671934 1 # Si
6294.09402380 4.09402380 12.27225867 1 # Si
6304.10489826 1.37590571 15.03413728 1 # Si
6311.37590571 4.10489826 15.03413728 1 # Si
632-0.00000000 -0.00000000 16.37821316 1 # Si
6332.90187977 2.90187977 16.34634363 1 # Si
634-0.16771494 5.63195305 -16.34134070 1 # Si
6352.72110232 5.45900176 -13.66848041 1 # Si
6360.02610591 8.18242131 -13.61801556 1 # Si
6371.33325565 6.81988895 -15.00302593 1 # Si
6384.06859976 6.80192861 -12.27486761 1 # Si
6391.35303719 9.54338724 -12.25680064 1 # Si
6400.00117430 5.46070202 -10.91922202 1 # Si
6412.73303284 8.18958251 -10.91859845 1 # Si
6422.72890018 5.45886981 -8.19145655 1 # Si
643-0.01202455 8.19611273 -8.19611273 1 # Si
6441.36492084 6.82852418 -9.55456223 1 # Si
6454.08178398 9.53373688 -9.53373688 1 # Si
6464.09496380 6.82463094 -6.82463094 1 # Si
6471.36492084 9.55456223 -6.82852418 1 # Si
648-0.00319897 5.46373779 -5.46373779 1 # Si
6492.72890018 8.19145655 -5.45886981 1 # Si
6502.72992539 5.46104379 -2.72992539 1 # Si
6510.00218763 8.19459074 -2.72767707 1 # Si
6521.36387577 6.82665093 -4.09489509 1 # Si
6534.09195916 9.56309566 -4.09195916 1 # Si
6544.09489509 6.82665093 -1.36387577 1 # Si
6551.36256401 9.56805296 -1.36256401 1 # Si
656-0.00000000 5.46151579 0.00000000 1 # Si
6572.72767707 8.19459074 -0.00218763 1 # Si
6582.73166810 5.46412728 2.73166810 1 # Si
659-0.00218763 8.19459074 2.72767707 1 # Si
6601.36467215 6.82823155 1.36467215 1 # Si
6614.09408870 9.55550648 1.36300571 1 # Si
6624.09758805 6.82965269 4.09758805 1 # Si
6631.36300571 9.55550648 4.09408870 1 # Si
6640.00319897 5.46373779 5.46373779 1 # Si
6652.73481248 8.19802839 5.46221414 1 # Si
6662.73481248 5.46221414 8.19802839 1 # Si
6670.01202455 8.19611273 8.19611273 1 # Si
6681.37205363 6.82894278 6.82894278 1 # Si
6694.10298675 9.56489675 6.82705083 1 # Si
6704.10298675 6.82705083 9.56489675 1 # Si
6711.38318968 9.56265344 9.56265344 1 # Si
672-0.00117430 5.46070202 10.91922202 1 # Si
6732.74437054 8.19260795 10.93410102 1 # Si
6742.69385812 5.42510828 13.61156377 1 # Si
675-0.02610591 8.18242131 13.61801556 1 # Si
6761.35920843 6.82966552 12.27960089 1 # Si
6774.12509126 9.54137501 12.28371979 1 # Si
6780.16771494 5.63195305 16.34134070 1 # Si
6790.01304838 10.90149235 -10.90149235 1 # Si
6802.73303284 10.91859845 -8.18958251 1 # Si
6810.02610591 13.61801556 -8.18242131 1 # Si
6821.35303719 12.25680064 -9.54338724 1 # Si
6834.06859976 12.27486761 -6.80192861 1 # Si
6841.33325565 15.00302593 -6.81988895 1 # Si
6850.00117430 10.91922202 -5.46070202 1 # Si
6862.72110232 13.66848041 -5.45900176 1 # Si
6872.72665137 10.93548204 -2.72665137 1 # Si
688-0.00300340 13.68153993 -2.73058729 1 # Si
6891.35797308 12.30121531 -4.09496434 1 # Si
6904.07390228 15.01032944 -4.07390228 1 # Si
6914.09496434 12.30121531 -1.35797308 1 # Si
6921.36373167 15.04397328 -1.36373167 1 # Si
693-0.00000000 10.94349563 0.00000000 1 # Si
6942.73058729 13.68153993 0.00300340 1 # Si
6952.72813440 10.91829649 2.72813440 1 # Si
6960.00300340 13.68153993 2.73058729 1 # Si
6971.37011664 12.30671934 1.37011664 1 # Si
6984.10489826 15.03413728 1.37590571 1 # Si
6994.09402380 12.27225867 4.09402380 1 # Si
7001.37590571 15.03413728 4.10489826 1 # Si
701-0.00117430 10.91922202 5.46070202 1 # Si
7022.69385812 13.61156377 5.42510828 1 # Si
7032.74437054 10.93410102 8.19260795 1 # Si
704-0.02610591 13.61801556 8.18242131 1 # Si
7051.35920843 12.27960089 6.82966552 1 # Si
7064.12509126 12.28371979 9.54137501 1 # Si
707-0.01304838 10.90149235 10.90149235 1 # Si
708-0.16771494 16.34134070 -5.63195305 1 # Si
7092.73136988 16.37023166 -2.73136988 1 # Si
7100.00000000 16.37821316 -0.00000000 1 # Si
7112.90187977 16.34634363 2.90187977 1 # Si
7120.16771494 16.34134070 5.63195305 1 # Si
7135.61139268 -13.62454044 -8.34100436 1 # Si
7146.84800235 -12.30649445 -6.84800235 1 # Si
7158.34100436 -13.62454044 -5.61139268 1 # Si
7165.42510828 -13.61156377 -2.69385812 1 # Si
7179.54137501 -12.28371979 -4.12509126 1 # Si
7186.82966552 -12.27960089 -1.35920843 1 # Si
7195.63195305 -16.34134070 -0.16771494 1 # Si
7208.18242131 -13.61801556 0.02610591 1 # Si
7215.45900176 -13.66848041 2.72110232 1 # Si
7226.81988895 -15.00302593 1.33325565 1 # Si
7239.54338724 -12.25680064 1.35303719 1 # Si
7246.80192861 -12.27486761 4.06859976 1 # Si
7255.61139268 -8.34100436 -13.62454044 1 # Si
7266.84800235 -6.84800235 -12.30649445 1 # Si
7275.43284542 -10.90291341 -10.90291341 1 # Si
7288.17530342 -8.17530342 -10.89255734 1 # Si
7298.17530342 -10.89255734 -8.17530342 1 # Si
7305.46312108 -8.19345007 -8.19345007 1 # Si
7316.82271744 -9.55790920 -9.55790920 1 # Si
7329.55790920 -6.82271744 -9.55790920 1 # Si
7339.55790920 -9.55790920 -6.82271744 1 # Si
7346.82700831 -6.82700831 -6.82700831 1 # Si
7355.47569502 -10.94234231 -5.47569502 1 # Si
7368.19345007 -8.19345007 -5.46312108 1 # Si
7378.19260795 -10.93410102 -2.74437054 1 # Si
7385.46221414 -8.19802839 -2.73481248 1 # Si
7396.82705083 -9.56489675 -4.10298675 1 # Si
7409.56489675 -6.82705083 -4.10298675 1 # Si
7419.56265344 -9.56265344 -1.38318968 1 # Si
7426.82894278 -6.82894278 -1.37205363 1 # Si
7435.46070202 -10.91922202 0.00117430 1 # Si
7448.19611273 -8.19611273 -0.01202455 1 # Si
7458.18958251 -10.91859845 2.73303284 1 # Si
7465.45886981 -8.19145655 2.72890018 1 # Si
7476.82852418 -9.55456223 1.36492084 1 # Si
7489.55456223 -6.82852418 1.36492084 1 # Si
7499.53373688 -9.53373688 4.08178398 1 # Si
7506.82463094 -6.82463094 4.09496380 1 # Si
7515.45537137 -10.92814020 5.45537137 1 # Si
7528.18684496 -8.18684496 5.46061302 1 # Si
7535.46061302 -8.18684496 8.18684496 1 # Si
7546.80203106 -9.53854106 6.80203106 1 # Si
7559.53854106 -6.80203106 6.80203106 1 # Si
7566.80203106 -6.80203106 9.53854106 1 # Si
7578.34100436 -5.61139268 -13.62454044 1 # Si
7585.42510828 -2.69385812 -13.61156377 1 # Si
7599.54137501 -4.12509126 -12.28371979 1 # Si
7606.82966552 -1.35920843 -12.27960089 1 # Si
7615.47569502 -5.47569502 -10.94234231 1 # Si
7628.19260795 -2.74437054 -10.93410102 1 # Si
7638.19345007 -5.46312108 -8.19345007 1 # Si
7645.46221414 -2.73481248 -8.19802839 1 # Si
7656.82705083 -4.10298675 -9.56489675 1 # Si
7669.56265344 -1.38318968 -9.56265344 1 # Si
7679.56489675 -4.10298675 -6.82705083 1 # Si
7686.82894278 -1.37205363 -6.82894278 1 # Si
7695.46236779 -5.46236779 -5.46236779 1 # Si
7708.19802839 -2.73481248 -5.46221414 1 # Si
7718.19802839 -5.46221414 -2.73481248 1 # Si
7725.46412728 -2.73166810 -2.73166810 1 # Si
7736.82965269 -4.09758805 -4.09758805 1 # Si
7749.55550648 -1.36300571 -4.09408870 1 # Si
7759.55550648 -4.09408870 -1.36300571 1 # Si
7766.82823155 -1.36467215 -1.36467215 1 # Si
7775.46373779 -5.46373779 -0.00319897 1 # Si
7788.19459074 -2.72767707 0.00218763 1 # Si
7798.19145655 -5.45886981 2.72890018 1 # Si
7805.46104379 -2.72992539 2.72992539 1 # Si
7816.82665093 -4.09489509 1.36387577 1 # Si
7829.56805296 -1.36256401 1.36256401 1 # Si
7839.56309566 -4.09195916 4.09195916 1 # Si
7846.82665093 -1.36387577 4.09489509 1 # Si
7855.45846237 -5.45846237 5.45846237 1 # Si
7868.19145655 -2.72890018 5.45886981 1 # Si
7878.18684496 -5.46061302 8.18684496 1 # Si
7885.45886981 -2.72890018 8.19145655 1 # Si
7896.82463094 -4.09496380 6.82463094 1 # Si
7909.55456223 -1.36492084 6.82852418 1 # Si
7919.53373688 -4.08178398 9.53373688 1 # Si
7926.82852418 -1.36492084 9.55456223 1 # Si
7935.45537137 -5.45537137 10.92814020 1 # Si
7948.18958251 -2.73303284 10.91859845 1 # Si
7955.45900176 -2.72110232 13.66848041 1 # Si
7966.80192861 -4.06859976 12.27486761 1 # Si
7979.54338724 -1.35303719 12.25680064 1 # Si
7986.81988895 -1.33325565 15.00302593 1 # Si
7995.63195305 -0.16771494 -16.34134070 1 # Si
8008.18242131 0.02610591 -13.61801556 1 # Si
8015.45900176 2.72110232 -13.66848041 1 # Si
8026.81988895 1.33325565 -15.00302593 1 # Si
8039.54338724 1.35303719 -12.25680064 1 # Si
8046.80192861 4.06859976 -12.27486761 1 # Si
8055.46070202 0.00117430 -10.91922202 1 # Si
8068.18958251 2.73303284 -10.91859845 1 # Si
8078.19611273 -0.01202455 -8.19611273 1 # Si
8085.45886981 2.72890018 -8.19145655 1 # Si
8096.82852418 1.36492084 -9.55456223 1 # Si
8109.53373688 4.08178398 -9.53373688 1 # Si
8119.55456223 1.36492084 -6.82852418 1 # Si
8126.82463094 4.09496380 -6.82463094 1 # Si
8135.46373779 -0.00319897 -5.46373779 1 # Si
8148.19145655 2.72890018 -5.45886981 1 # Si
8158.19459074 0.00218763 -2.72767707 1 # Si
8165.46104379 2.72992539 -2.72992539 1 # Si
8176.82665093 1.36387577 -4.09489509 1 # Si
8189.56309566 4.09195916 -4.09195916 1 # Si
8199.56805296 1.36256401 -1.36256401 1 # Si
8206.82665093 4.09489509 -1.36387577 1 # Si
8215.46151579 -0.00000000 0.00000000 1 # Si
8228.19459074 2.72767707 -0.00218763 1 # Si
8238.19459074 -0.00218763 2.72767707 1 # Si
8245.46412728 2.73166810 2.73166810 1 # Si
8256.82823155 1.36467215 1.36467215 1 # Si
8269.55550648 4.09408870 1.36300571 1 # Si
8279.55550648 1.36300571 4.09408870 1 # Si
8286.82965269 4.09758805 4.09758805 1 # Si
8295.46373779 0.00319897 5.46373779 1 # Si
8308.19802839 2.73481248 5.46221414 1 # Si
8318.19611273 0.01202455 8.19611273 1 # Si
8325.46221414 2.73481248 8.19802839 1 # Si
8336.82894278 1.37205363 6.82894278 1 # Si
8349.56489675 4.10298675 6.82705083 1 # Si
8359.56265344 1.38318968 9.56265344 1 # Si
8366.82705083 4.10298675 9.56489675 1 # Si
8375.46070202 -0.00117430 10.91922202 1 # Si
8388.19260795 2.74437054 10.93410102 1 # Si
8398.18242131 -0.02610591 13.61801556 1 # Si
8405.42510828 2.69385812 13.61156377 1 # Si
8416.82966552 1.35920843 12.27960089 1 # Si
8429.54137501 4.12509126 12.28371979 1 # Si
8435.63195305 0.16771494 16.34134070 1 # Si
8445.45537137 5.45537137 -10.92814020 1 # Si
8458.18684496 5.46061302 -8.18684496 1 # Si
8465.46061302 8.18684496 -8.18684496 1 # Si
8476.80203106 6.80203106 -9.53854106 1 # Si
8489.53854106 6.80203106 -6.80203106 1 # Si
8496.80203106 9.53854106 -6.80203106 1 # Si
8505.45846237 5.45846237 -5.45846237 1 # Si
8518.18684496 8.18684496 -5.46061302 1 # Si
8528.19145655 5.45886981 -2.72890018 1 # Si
8535.45886981 8.19145655 -2.72890018 1 # Si
8546.82463094 6.82463094 -4.09496380 1 # Si
8559.53373688 9.53373688 -4.08178398 1 # Si
8569.55456223 6.82852418 -1.36492084 1 # Si
8576.82852418 9.55456223 -1.36492084 1 # Si
8585.46373779 5.46373779 0.00319897 1 # Si
8598.19611273 8.19611273 0.01202455 1 # Si
8608.19802839 5.46221414 2.73481248 1 # Si
8615.46221414 8.19802839 2.73481248 1 # Si
8626.82894278 6.82894278 1.37205363 1 # Si
8639.56265344 9.56265344 1.38318968 1 # Si
8649.56489675 6.82705083 4.10298675 1 # Si
8656.82705083 9.56489675 4.10298675 1 # Si
8665.46236779 5.46236779 5.46236779 1 # Si
8678.19345007 8.19345007 5.46312108 1 # Si
8688.19345007 5.46312108 8.19345007 1 # Si
8695.46312108 8.19345007 8.19345007 1 # Si
8706.82700831 6.82700831 6.82700831 1 # Si
8719.55790920 9.55790920 6.82271744 1 # Si
8729.55790920 6.82271744 9.55790920 1 # Si
8736.82271744 9.55790920 9.55790920 1 # Si
8745.47569502 5.47569502 10.94234231 1 # Si
8758.17530342 8.17530342 10.89255734 1 # Si
8768.34100436 5.61139268 13.62454044 1 # Si
8775.61139268 8.34100436 13.62454044 1 # Si
8786.84800235 6.84800235 12.30649445 1 # Si
8795.45537137 10.92814020 -5.45537137 1 # Si
8808.18958251 10.91859845 -2.73303284 1 # Si
8815.45900176 13.66848041 -2.72110232 1 # Si
8826.80192861 12.27486761 -4.06859976 1 # Si
8839.54338724 12.25680064 -1.35303719 1 # Si
8846.81988895 15.00302593 -1.33325565 1 # Si
8855.46070202 10.91922202 -0.00117430 1 # Si
8868.18242131 13.61801556 -0.02610591 1 # Si
8878.19260795 10.93410102 2.74437054 1 # Si
8885.42510828 13.61156377 2.69385812 1 # Si
8896.82966552 12.27960089 1.35920843 1 # Si
8909.54137501 12.28371979 4.12509126 1 # Si
8915.47569502 10.94234231 5.47569502 1 # Si
8928.34100436 13.62454044 5.61139268 1 # Si
8938.17530342 10.89255734 8.17530342 1 # Si
8945.61139268 13.62454044 8.34100436 1 # Si
8956.84800235 12.30649445 6.84800235 1 # Si
8965.43284542 10.90291341 10.90291341 1 # Si
8975.63195305 16.34134070 0.16771494 1 # Si
89810.89255734 -8.17530342 -8.17530342 1 # Si
89912.30649445 -6.84800235 -6.84800235 1 # Si
90010.90291341 -10.90291341 -5.43284542 1 # Si
90113.62454044 -8.34100436 -5.61139268 1 # Si
90210.93410102 -8.19260795 -2.74437054 1 # Si
90312.28371979 -9.54137501 -4.12509126 1 # Si
90412.27960089 -6.82966552 -1.35920843 1 # Si
90510.90149235 -10.90149235 0.01304838 1 # Si
90613.61801556 -8.18242131 0.02610591 1 # Si
90710.91859845 -8.18958251 2.73303284 1 # Si
90812.25680064 -9.54338724 1.35303719 1 # Si
90915.00302593 -6.81988895 1.33325565 1 # Si
91012.27486761 -6.80192861 4.06859976 1 # Si
91110.90291341 -5.43284542 -10.90291341 1 # Si
91213.62454044 -5.61139268 -8.34100436 1 # Si
91310.93410102 -2.74437054 -8.19260795 1 # Si
91412.28371979 -4.12509126 -9.54137501 1 # Si
91512.27960089 -1.35920843 -6.82966552 1 # Si
91610.94234231 -5.47569502 -5.47569502 1 # Si
91713.61156377 -2.69385812 -5.42510828 1 # Si
91813.61156377 -5.42510828 -2.69385812 1 # Si
91910.91829649 -2.72813440 -2.72813440 1 # Si
92012.27225867 -4.09402380 -4.09402380 1 # Si
92115.03413728 -1.37590571 -4.10489826 1 # Si
92215.03413728 -4.10489826 -1.37590571 1 # Si
92312.30671934 -1.37011664 -1.37011664 1 # Si
92410.91922202 -5.46070202 0.00117430 1 # Si
92513.68153993 -2.73058729 -0.00300340 1 # Si
92613.66848041 -5.45900176 2.72110232 1 # Si
92710.93548204 -2.72665137 2.72665137 1 # Si
92812.30121531 -4.09496434 1.35797308 1 # Si
92915.04397328 -1.36373167 1.36373167 1 # Si
93015.01032944 -4.07390228 4.07390228 1 # Si
93112.30121531 -1.35797308 4.09496434 1 # Si
93210.92814020 -5.45537137 5.45537137 1 # Si
93313.66848041 -2.72110232 5.45900176 1 # Si
93410.91859845 -2.73303284 8.18958251 1 # Si
93512.27486761 -4.06859976 6.80192861 1 # Si
93615.00302593 -1.33325565 6.81988895 1 # Si
93712.25680064 -1.35303719 9.54338724 1 # Si
93810.90149235 0.01304838 -10.90149235 1 # Si
93913.61801556 0.02610591 -8.18242131 1 # Si
94010.91859845 2.73303284 -8.18958251 1 # Si
94112.25680064 1.35303719 -9.54338724 1 # Si
94215.00302593 1.33325565 -6.81988895 1 # Si
94312.27486761 4.06859976 -6.80192861 1 # Si
94410.91922202 0.00117430 -5.46070202 1 # Si
94513.66848041 2.72110232 -5.45900176 1 # Si
94613.68153993 -0.00300340 -2.73058729 1 # Si
94710.93548204 2.72665137 -2.72665137 1 # Si
94812.30121531 1.35797308 -4.09496434 1 # Si
94915.01032944 4.07390228 -4.07390228 1 # Si
95015.04397328 1.36373167 -1.36373167 1 # Si
95112.30121531 4.09496434 -1.35797308 1 # Si
95210.94349563 0.00000000 -0.00000000 1 # Si
95313.68153993 2.73058729 0.00300340 1 # Si
95413.68153993 0.00300340 2.73058729 1 # Si
95510.91829649 2.72813440 2.72813440 1 # Si
95612.30671934 1.37011664 1.37011664 1 # Si
95715.03413728 4.10489826 1.37590571 1 # Si
95815.03413728 1.37590571 4.10489826 1 # Si
95912.27225867 4.09402380 4.09402380 1 # Si
96010.91922202 -0.00117430 5.46070202 1 # Si
96113.61156377 2.69385812 5.42510828 1 # Si
96213.61801556 -0.02610591 8.18242131 1 # Si
96310.93410102 2.74437054 8.19260795 1 # Si
96412.27960089 1.35920843 6.82966552 1 # Si
96512.28371979 4.12509126 9.54137501 1 # Si
96610.90149235 -0.01304838 10.90149235 1 # Si
96710.92814020 5.45537137 -5.45537137 1 # Si
96813.66848041 5.45900176 -2.72110232 1 # Si
96910.91859845 8.18958251 -2.73303284 1 # Si
97012.27486761 6.80192861 -4.06859976 1 # Si
97115.00302593 6.81988895 -1.33325565 1 # Si
97212.25680064 9.54338724 -1.35303719 1 # Si
97310.91922202 5.46070202 -0.00117430 1 # Si
97413.61801556 8.18242131 -0.02610591 1 # Si
97513.61156377 5.42510828 2.69385812 1 # Si
97610.93410102 8.19260795 2.74437054 1 # Si
97712.27960089 6.82966552 1.35920843 1 # Si
97812.28371979 9.54137501 4.12509126 1 # Si
97910.94234231 5.47569502 5.47569502 1 # Si
98013.62454044 8.34100436 5.61139268 1 # Si
98113.62454044 5.61139268 8.34100436 1 # Si
98210.89255734 8.17530342 8.17530342 1 # Si
98312.30649445 6.84800235 6.84800235 1 # Si
98410.90291341 5.43284542 10.90291341 1 # Si
98510.90149235 10.90149235 -0.01304838 1 # Si
98610.90291341 10.90291341 5.43284542 1 # Si
98716.34634363 -2.90187977 -2.90187977 1 # Si
98816.34134070 -5.63195305 -0.16771494 1 # Si
98916.37023166 -2.73136988 2.73136988 1 # Si
99016.34134070 -0.16771494 -5.63195305 1 # Si
99116.37023166 2.73136988 -2.73136988 1 # Si
99216.37821316 -0.00000000 -0.00000000 1 # Si
99316.34634363 2.90187977 2.90187977 1 # Si
99416.34134070 0.16771494 5.63195305 1 # Si
99516.34134070 5.63195305 0.16771494 1 # Si
996-13.14107205 -7.67106057 -4.93773883 2 # H
997-13.12200813 -10.41265519 -2.22197561 2 # H
998-15.87081431 -7.68828624 -2.20095245 2 # H
999-14.47909953 -9.05114009 0.84828128 2 # H
1000-13.14542628 -10.40942624 3.25101117 2 # H
1001-13.12200813 -2.22197561 -10.41265519 2 # H
1002-13.14107205 -4.93773883 -7.67106057 2 # H
1003-15.87081431 -2.20095245 -7.68828624 2 # H
1004-15.87684776 -4.94293745 -4.94293745 2 # H
1005-14.48904263 -6.28831840 3.55914512 2 # H
1006-14.48904263 -3.55914512 6.28831840 2 # H
1007-13.14542628 -3.25101117 10.40942624 2 # H
1008-14.47909953 0.84828128 -9.05114009 2 # H
1009-13.14542628 3.25101117 -10.40942624 2 # H
1010-14.48904263 3.55914512 -6.28831840 2 # H
1011-15.87684776 4.94293745 4.94293745 2 # H
1012-14.47909953 -0.84828128 9.05114009 2 # H
1013-15.87081431 2.20095245 7.68828624 2 # H
1014-13.14107205 4.93773883 7.67106057 2 # H
1015-13.12200813 2.22197561 10.41265519 2 # H
1016-14.48904263 6.28831840 -3.55914512 2 # H
1017-13.14542628 10.40942624 -3.25101117 2 # H
1018-14.47909953 9.05114009 -0.84828128 2 # H
1019-15.87081431 7.68828624 2.20095245 2 # H
1020-13.12200813 10.41265519 2.22197561 2 # H
1021-13.14107205 7.67106057 4.93773883 2 # H
1022-7.67106057 -13.14107205 -4.93773883 2 # H
1023-7.68828624 -15.87081431 -2.20095245 2 # H
1024-10.41265519 -13.12200813 -2.22197561 2 # H
1025-9.05114009 -14.47909953 0.84828128 2 # H
1026-10.40942624 -13.14542628 3.25101117 2 # H
1027-7.67101940 -7.67101940 -10.40498818 2 # H
1028-7.67101940 -10.40498818 -7.67101940 2 # H
1029-10.40498818 -7.67101940 -7.67101940 2 # H
1030-10.40134885 -10.40134885 -4.95086615 2 # H
1031-11.77396546 -11.77396546 0.84562385 2 # H
1032-11.76622906 -11.76622906 6.31060507 2 # H
1033-9.04902753 -11.74957272 9.04902753 2 # H
1034-11.74957272 -9.04902753 9.04902753 2 # H
1035-9.04902753 -9.04902753 11.74957272 2 # H
1036-7.68828624 -2.20095245 -15.87081431 2 # H
1037-7.67106057 -4.93773883 -13.14107205 2 # H
1038-10.41265519 -2.22197561 -13.12200813 2 # H
1039-10.40134885 -4.95086615 -10.40134885 2 # H
1040-11.76622906 -6.31060507 11.76622906 2 # H
1041-10.40942624 -3.25101117 13.14542628 2 # H
1042-9.05114009 0.84828128 -14.47909953 2 # H
1043-10.40942624 3.25101117 -13.14542628 2 # H
1044-11.77396546 0.84562385 -11.77396546 2 # H
1045-10.40134885 4.95086615 10.40134885 2 # H
1046-11.77396546 -0.84562385 11.77396546 2 # H
1047-9.05114009 -0.84828128 14.47909953 2 # H
1048-10.41265519 2.22197561 13.12200813 2 # H
1049-7.67106057 4.93773883 13.14107205 2 # H
1050-7.68828624 2.20095245 15.87081431 2 # H
1051-11.76622906 6.31060507 -11.76622906 2 # H
1052-9.04902753 9.04902753 -11.74957272 2 # H
1053-11.74957272 9.04902753 -9.04902753 2 # H
1054-10.40134885 10.40134885 4.95086615 2 # H
1055-10.40498818 7.67101940 7.67101940 2 # H
1056-7.67101940 10.40498818 7.67101940 2 # H
1057-7.67101940 7.67101940 10.40498818 2 # H
1058-9.04902753 11.74957272 -9.04902753 2 # H
1059-11.76622906 11.76622906 -6.31060507 2 # H
1060-10.40942624 13.14542628 -3.25101117 2 # H
1061-11.77396546 11.77396546 -0.84562385 2 # H
1062-9.05114009 14.47909953 -0.84828128 2 # H
1063-10.41265519 13.12200813 2.22197561 2 # H
1064-7.68828624 15.87081431 2.20095245 2 # H
1065-7.67106057 13.14107205 4.93773883 2 # H
1066-2.22197561 -13.12200813 -10.41265519 2 # H
1067-2.20095245 -15.87081431 -7.68828624 2 # H
1068-4.93773883 -13.14107205 -7.67106057 2 # H
1069-4.94293745 -15.87684776 -4.94293745 2 # H
1070-6.28831840 -14.48904263 3.55914512 2 # H
1071-3.55914512 -14.48904263 6.28831840 2 # H
1072-3.25101117 -13.14542628 10.40942624 2 # H
1073-2.20095245 -7.68828624 -15.87081431 2 # H
1074-2.22197561 -10.41265519 -13.12200813 2 # H
1075-4.93773883 -7.67106057 -13.14107205 2 # H
1076-4.95086615 -10.40134885 -10.40134885 2 # H
1077-6.31060507 -11.76622906 11.76622906 2 # H
1078-3.25101117 -10.40942624 13.14542628 2 # H
1079-4.94293745 -4.94293745 -15.87684776 2 # H
1080-3.55914512 -6.28831840 14.48904263 2 # H
1081-6.28831840 -3.55914512 14.48904263 2 # H
1082-6.28831840 3.55914512 -14.48904263 2 # H
1083-4.94293745 4.94293745 15.87684776 2 # H
1084-3.55914512 6.28831840 -14.48904263 2 # H
1085-3.25101117 10.40942624 -13.14542628 2 # H
1086-4.95086615 10.40134885 10.40134885 2 # H
1087-4.93773883 7.67106057 13.14107205 2 # H
1088-2.22197561 10.41265519 13.12200813 2 # H
1089-2.20095245 7.68828624 15.87081431 2 # H
1090-6.31060507 11.76622906 -11.76622906 2 # H
1091-3.25101117 13.14542628 -10.40942624 2 # H
1092-3.55914512 14.48904263 -6.28831840 2 # H
1093-6.28831840 14.48904263 -3.55914512 2 # H
1094-4.94293745 15.87684776 4.94293745 2 # H
1095-4.93773883 13.14107205 7.67106057 2 # H
1096-2.20095245 15.87081431 7.68828624 2 # H
1097-2.22197561 13.12200813 10.41265519 2 # H
10980.84828128 -14.47909953 -9.05114009 2 # H
10993.25101117 -13.14542628 -10.40942624 2 # H
11003.55914512 -14.48904263 -6.28831840 2 # H
11014.94293745 -15.87684776 4.94293745 2 # H
1102-0.84828128 -14.47909953 9.05114009 2 # H
11032.20095245 -15.87081431 7.68828624 2 # H
11044.93773883 -13.14107205 7.67106057 2 # H
11052.22197561 -13.12200813 10.41265519 2 # H
11060.84828128 -9.05114009 -14.47909953 2 # H
11073.25101117 -10.40942624 -13.14542628 2 # H
11080.84562385 -11.77396546 -11.77396546 2 # H
11094.95086615 -10.40134885 10.40134885 2 # H
1110-0.84562385 -11.77396546 11.77396546 2 # H
1111-0.84828128 -9.05114009 14.47909953 2 # H
11122.22197561 -10.41265519 13.12200813 2 # H
11134.93773883 -7.67106057 13.14107205 2 # H
11142.20095245 -7.68828624 15.87081431 2 # H
11153.55914512 -6.28831840 -14.48904263 2 # H
11164.94293745 -4.94293745 15.87684776 2 # H
11174.94293745 4.94293745 -15.87684776 2 # H
1118-0.84828128 9.05114009 -14.47909953 2 # H
11192.20095245 7.68828624 -15.87081431 2 # H
11204.93773883 7.67106057 -13.14107205 2 # H
11212.22197561 10.41265519 -13.12200813 2 # H
11224.95086615 10.40134885 -10.40134885 2 # H
11233.55914512 6.28831840 14.48904263 2 # H
11240.84828128 9.05114009 14.47909953 2 # H
11253.25101117 10.40942624 13.14542628 2 # H
-0.84562385 11.77396546 -11.77396546 2 # H
-0.84828128 14.47909953 -9.05114009 2 # H
2.22197561 13.12200813 -10.41265519 2 # H
4.93773883 13.14107205 -7.67106057 2 # H
2.20095245 15.87081431 -7.68828624 2 # H
4.94293745 15.87684776 -4.94293745 2 # H
3.55914512 14.48904263 6.28831840 2 # H
0.84828128 14.47909953 9.05114009 2 # H
3.25101117 13.14542628 10.40942624 2 # H
0.84562385 11.77396546 11.77396546 2 # H
6.28831840 -14.48904263 -3.55914512 2 # H
10.40942624 -13.14542628 -3.25101117 2 # H
9.05114009 -14.47909953 -0.84828128 2 # H
7.68828624 -15.87081431 2.20095245 2 # H
10.41265519 -13.12200813 2.22197561 2 # H
7.67106057 -13.14107205 4.93773883 2 # H
6.31060507 -11.76622906 -11.76622906 2 # H
9.04902753 -9.04902753 -11.74957272 2 # H
9.04902753 -11.74957272 -9.04902753 2 # H
10.40134885 -10.40134885 4.95086615 2 # H
7.67101940 -10.40498818 7.67101940 2 # H
10.40498818 -7.67101940 7.67101940 2 # H
7.67101940 -7.67101940 10.40498818 2 # H
6.28831840 -3.55914512 -14.48904263 2 # H
10.40942624 -3.25101117 -13.14542628 2 # H
10.40134885 -4.95086615 10.40134885 2 # H
7.67106057 -4.93773883 13.14107205 2 # H
10.41265519 -2.22197561 13.12200813 2 # H
7.68828624 -2.20095245 15.87081431 2 # H
9.05114009 -0.84828128 -14.47909953 2 # H
7.68828624 2.20095245 -15.87081431 2 # H
10.41265519 2.22197561 -13.12200813 2 # H
7.67106057 4.93773883 -13.14107205 2 # H
10.40134885 4.95086615 -10.40134885 2 # H
9.05114009 0.84828128 14.47909953 2 # H
6.28831840 3.55914512 14.48904263 2 # H
10.40942624 3.25101117 13.14542628 2 # H
7.67101940 7.67101940 -10.40498818 2 # H
10.40498818 7.67101940 -7.67101940 2 # H
7.67101940 10.40498818 -7.67101940 2 # H
10.40134885 10.40134885 -4.95086615 2 # H
9.04902753 9.04902753 11.74957272 2 # H
7.67106057 13.14107205 -4.93773883 2 # H
10.41265519 13.12200813 -2.22197561 2 # H
7.68828624 15.87081431 -2.20095245 2 # H
9.05114009 14.47909953 0.84828128 2 # H
6.28831840 14.48904263 3.55914512 2 # H
10.40942624 13.14542628 3.25101117 2 # H
9.04902753 11.74957272 9.04902753 2 # H
6.31060507 11.76622906 11.76622906 2 # H
11.74957272 -9.04902753 -9.04902753 2 # H
11.76622906 -11.76622906 -6.31060507 2 # H
13.14542628 -10.40942624 -3.25101117 2 # H
11.77396546 -11.77396546 -0.84562385 2 # H
14.47909953 -9.05114009 -0.84828128 2 # H
13.12200813 -10.41265519 2.22197561 2 # H
15.87081431 -7.68828624 2.20095245 2 # H
13.14107205 -7.67106057 4.93773883 2 # H
11.76622906 -6.31060507 -11.76622906 2 # H
13.14542628 -3.25101117 -10.40942624 2 # H
14.48904263 -3.55914512 -6.28831840 2 # H
14.48904263 -6.28831840 -3.55914512 2 # H
15.87684776 -4.94293745 4.94293745 2 # H
13.14107205 -4.93773883 7.67106057 2 # H
15.87081431 -2.20095245 7.68828624 2 # H
13.12200813 -2.22197561 10.41265519 2 # H
11.77396546 -0.84562385 -11.77396546 2 # H
14.47909953 -0.84828128 -9.05114009 2 # H
13.12200813 2.22197561 -10.41265519 2 # H
15.87081431 2.20095245 -7.68828624 2 # H
13.14107205 4.93773883 -7.67106057 2 # H
15.87684776 4.94293745 -4.94293745 2 # H
14.48904263 3.55914512 6.28831840 2 # H
14.47909953 0.84828128 9.05114009 2 # H
13.14542628 3.25101117 10.40942624 2 # H
11.77396546 0.84562385 11.77396546 2 # H
13.14107205 7.67106057 -4.93773883 2 # H
15.87081431 7.68828624 -2.20095245 2 # H
13.12200813 10.41265519 -2.22197561 2 # H
14.47909953 9.05114009 0.84828128 2 # H
14.48904263 6.28831840 3.55914512 2 # H
13.14542628 10.40942624 3.25101117 2 # H
11.74957272 9.04902753 9.04902753 2 # H
11.76622906 6.31060507 11.76622906 2 # H
11.77396546 11.77396546 0.84562385 2 # H
11.76622906 11.76622906 6.31060507 2 # H
-14.25049869 -9.30903112 6.56976732 2 # H
-14.73854562 -7.62887682 4.90661761 2 # H
-17.26005319 -3.58592496 -1.88119305 2 # H
-17.26005319 -1.88119305 -3.58592496 2 # H
-16.92458989 -6.60756054 1.14534264 2 # H
-17.48851584 -4.95538862 -0.51858345 2 # H
-16.92565018 -3.87958702 3.87958702 2 # H
-17.49703350 -2.22300603 2.22300603 2 # H
-14.25049869 -6.56976732 9.30903112 2 # H
-14.73854562 -4.90661761 7.62887682 2 # H
-17.48851584 -0.51858345 -4.95538862 2 # H
-16.92458989 1.14534264 -6.60756054 2 # H
-17.49703350 2.22300603 -2.22300603 2 # H
-16.92565018 3.87958702 -3.87958702 2 # H
-17.26821617 -0.85235208 0.85235208 2 # H
-17.26821617 0.85235208 -0.85235208 2 # H
-17.26005319 1.88119305 3.58592496 2 # H
-17.26005319 3.58592496 1.88119305 2 # H
-16.92458989 -1.14534264 6.60756054 2 # H
-17.48851584 0.51858345 4.95538862 2 # H
-14.73854562 4.90661761 -7.62887682 2 # H
-14.25049869 6.56976732 -9.30903112 2 # H
-14.73854562 7.62887682 -4.90661761 2 # H
-14.25049869 9.30903112 -6.56976732 2 # H
-17.48851584 4.95538862 0.51858345 2 # H
-16.92458989 6.60756054 -1.14534264 2 # H
-7.62887682 -14.73854562 4.90661761 2 # H
-9.30903112 -14.25049869 6.56976732 2 # H
-7.62887682 -4.90661761 14.73854562 2 # H
-9.30903112 -6.56976732 14.25049869 2 # H
-9.30903112 6.56976732 -14.25049869 2 # H
-7.62887682 4.90661761 -14.73854562 2 # H
-9.30903112 14.25049869 -6.56976732 2 # H
-7.62887682 14.73854562 -4.90661761 2 # H
-1.88119305 -17.26005319 -3.58592496 2 # H
-3.58592496 -17.26005319 -1.88119305 2 # H
-4.95538862 -17.48851584 -0.51858345 2 # H
-6.60756054 -16.92458989 1.14534264 2 # H
-2.22300603 -17.49703350 2.22300603 2 # H
-3.87958702 -16.92565018 3.87958702 2 # H
-4.90661761 -14.73854562 7.62887682 2 # H
-6.56976732 -14.25049869 9.30903112 2 # H
-6.56976732 -9.30903112 14.25049869 2 # H
-4.90661761 -7.62887682 14.73854562 2 # H
-3.58592496 -1.88119305 -17.26005319 2 # H
-1.88119305 -3.58592496 -17.26005319 2 # H
-2.22300603 -2.22300603 17.49703350 2 # H
-3.87958702 -3.87958702 16.92565018 2 # H
-6.60756054 1.14534264 -16.92458989 2 # H
-4.95538862 -0.51858345 -17.48851584 2 # H
-3.87958702 3.87958702 -16.92565018 2 # H
-2.22300603 2.22300603 -17.49703350 2 # H
-6.60756054 -1.14534264 16.92458989 2 # H
-4.95538862 0.51858345 17.48851584 2 # H
-1.88119305 3.58592496 17.26005319 2 # H
-3.58592496 1.88119305 17.26005319 2 # H
-6.56976732 9.30903112 -14.25049869 2 # H
-4.90661761 7.62887682 -14.73854562 2 # H
-4.90661761 14.73854562 -7.62887682 2 # H
-6.56976732 14.25049869 -9.30903112 2 # H
-3.87958702 16.92565018 -3.87958702 2 # H
-2.22300603 17.49703350 -2.22300603 2 # H
-4.95538862 17.48851584 0.51858345 2 # H
-6.60756054 16.92458989 -1.14534264 2 # H
-3.58592496 17.26005319 1.88119305 2 # H
-1.88119305 17.26005319 3.58592496 2 # H
1.14534264 -16.92458989 -6.60756054 2 # H
-0.51858345 -17.48851584 -4.95538862 2 # H
3.87958702 -16.92565018 -3.87958702 2 # H
2.22300603 -17.49703350 -2.22300603 2 # H
0.85235208 -17.26821617 -0.85235208 2 # H
-0.85235208 -17.26821617 0.85235208 2 # H
3.58592496 -17.26005319 1.88119305 2 # H
1.88119305 -17.26005319 3.58592496 2 # H
0.51858345 -17.48851584 4.95538862 2 # H
-1.14534264 -16.92458989 6.60756054 2 # H
-0.51858345 -4.95538862 -17.48851584 2 # H
1.14534264 -6.60756054 -16.92458989 2 # H
2.22300603 -2.22300603 -17.49703350 2 # H
3.87958702 -3.87958702 -16.92565018 2 # H
-1.14534264 -6.60756054 16.92458989 2 # H
0.51858345 -4.95538862 17.48851584 2 # H
3.58592496 -1.88119305 17.26005319 2 # H
1.88119305 -3.58592496 17.26005319 2 # H
-0.85235208 0.85235208 -17.26821617 2 # H
0.85235208 -0.85235208 -17.26821617 2 # H
1.88119305 3.58592496 -17.26005319 2 # H
3.58592496 1.88119305 -17.26005319 2 # H
-0.85235208 -0.85235208 17.26821617 2 # H
0.85235208 0.85235208 17.26821617 2 # H
3.87958702 3.87958702 16.92565018 2 # H
2.22300603 2.22300603 17.49703350 2 # H
-1.14534264 6.60756054 -16.92458989 2 # H
0.51858345 4.95538862 -17.48851584 2 # H
-0.51858345 4.95538862 17.48851584 2 # H
1.14534264 6.60756054 16.92458989 2 # H
0.51858345 17.48851584 -4.95538862 2 # H
-1.14534264 16.92458989 -6.60756054 2 # H
1.88119305 17.26005319 -3.58592496 2 # H
3.58592496 17.26005319 -1.88119305 2 # H
0.85235208 17.26821617 0.85235208 2 # H
-0.85235208 17.26821617 -0.85235208 2 # H
2.22300603 17.49703350 2.22300603 2 # H
3.87958702 16.92565018 3.87958702 2 # H
1.14534264 16.92458989 6.60756054 2 # H
-0.51858345 17.48851584 4.95538862 2 # H
6.56976732 -14.25049869 -9.30903112 2 # H
4.90661761 -14.73854562 -7.62887682 2 # H
9.30903112 -14.25049869 -6.56976732 2 # H
7.62887682 -14.73854562 -4.90661761 2 # H
6.60756054 -16.92458989 -1.14534264 2 # H
4.95538862 -17.48851584 0.51858345 2 # H
4.90661761 -7.62887682 -14.73854562 2 # H
6.56976732 -9.30903112 -14.25049869 2 # H
7.62887682 -4.90661761 -14.73854562 2 # H
9.30903112 -6.56976732 -14.25049869 2 # H
4.95538862 0.51858345 -17.48851584 2 # H
6.60756054 -1.14534264 -16.92458989 2 # H
4.95538862 -0.51858345 17.48851584 2 # H
6.60756054 1.14534264 16.92458989 2 # H
9.30903112 6.56976732 14.25049869 2 # H
7.62887682 4.90661761 14.73854562 2 # H
4.90661761 7.62887682 14.73854562 2 # H
6.56976732 9.30903112 14.25049869 2 # H
7.62887682 14.73854562 4.90661761 2 # H
9.30903112 14.25049869 6.56976732 2 # H
6.56976732 14.25049869 9.30903112 2 # H
4.90661761 14.73854562 7.62887682 2 # H
6.60756054 16.92458989 1.14534264 2 # H
4.95538862 17.48851584 -0.51858345 2 # H
14.73854562 -7.62887682 -4.90661761 2 # H
14.25049869 -9.30903112 -6.56976732 2 # H
14.25049869 -6.56976732 -9.30903112 2 # H
14.73854562 -4.90661761 -7.62887682 2 # H
14.25049869 9.30903112 6.56976732 2 # H
14.73854562 7.62887682 4.90661761 2 # H
14.73854562 4.90661761 7.62887682 2 # H
14.25049869 6.56976732 9.30903112 2 # H
17.49703350 -2.22300603 -2.22300603 2 # H
16.92565018 -3.87958702 -3.87958702 2 # H
16.92458989 -6.60756054 -1.14534264 2 # H
17.48851584 -4.95538862 0.51858345 2 # H
17.26005319 -1.88119305 3.58592496 2 # H
17.26005319 -3.58592496 1.88119305 2 # H
16.92458989 -1.14534264 -6.60756054 2 # H
17.48851584 0.51858345 -4.95538862 2 # H
17.26005319 3.58592496 -1.88119305 2 # H
17.26005319 1.88119305 -3.58592496 2 # H
17.26821617 -0.85235208 -0.85235208 2 # H
17.26821617 0.85235208 0.85235208 2 # H
16.92565018 3.87958702 3.87958702 2 # H
17.49703350 2.22300603 2.22300603 2 # H
17.48851584 -0.51858345 4.95538862 2 # H
16.92458989 1.14534264 6.60756054 2 # H
17.48851584 4.95538862 -0.51858345 2 # H
16.92458989 6.60756054 1.14534264 2 # H
%endblock atomic-coordinates-and-atomic-species
=== added file 'Tests/si-quantum-dot/makefile'
--- Tests/si-quantum-dot/makefile 1970-01-01 00:00:00 +0000
+++ Tests/si-quantum-dot/makefile 2019-01-16 11:00:58 +0000
@@ -0,0 +1,7 @@
#
# Single-test makefile
#
name=si-quantum-dot
EXTRAFILES=coords.Si987H372.fdf
#
include ../test.mk
=== added file 'Tests/si-quantum-dot/si-quantum-dot.fdf'
--- Tests/si-quantum-dot/si-quantum-dot.fdf 1970-01-01 00:00:00 +0000
+++ Tests/si-quantum-dot/si-quantum-dot.fdf 2019-01-16 11:00:58 +0000
@@ -0,0 +1,39 @@
use-tree-timer T
SolutionMethod ELSI
elsi-solver elpa

WriteDMHS.NetCDF F
WriteDM.NetCDF F
WriteDM T

SystemName Si Quantum Dot
SystemLabel si-quantum-dot

%block ChemicalSpeciesLabel
 1 14 Si
 2 1 H
%endblock ChemicalSpeciesLabel

# CSIRO quantum dot
%include coords.Si987H372.fdf

PAO.BasisSize SZ
PAO.EnergyShift 300 meV

#
LatticeConstant 45.0 Ang
%block LatticeVectors
 1.000 0.000 0.000
 0.000 1.000 0.000
 0.000 0.000 1.000
%endblock LatticeVectors

MeshCutoff 150.0 Ry

MaxSCFIterations 40
DM.MixingWeight 0.3
DM.NumberPulay 4
DM.Tolerance 1.d-3
DM.UseSaveDM

ElectronicTemperature 25 meV
=== added file 'Tests/si-quantum-dot/si-quantum-dot.pseudos'
--- Tests/si-quantum-dot/si-quantum-dot.pseudos 1970-01-01 00:00:00 +0000
+++ Tests/si-quantum-dot/si-quantum-dot.pseudos 2019-01-16 11:00:58 +0000
@@ -0,0 +1,1 @@
Si H
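A note on running the new test: the makefile above defines only the test name and the extra coordinates file, so everything else (targets, the siesta executable location) is assumed to come from the included ../test.mk, following the conventions of the existing Tests/ harness. A minimal sketch under that assumption (the SIESTA make variable and the relative path are not confirmed by this diff):

    cd Tests/si-quantum-dot
    make SIESTA="mpirun -np 8 ../../../siesta"   # adjust launcher and executable path to your build

The 987-Si/372-H dot is sized to exercise the parallel ELSI path; the sih-elsi test added below is a much smaller system.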
=== added directory 'Tests/sih-elsi'
=== added file 'Tests/sih-elsi/makefile'
--- Tests/sih-elsi/makefile 1970-01-01 00:00:00 +0000
+++ Tests/sih-elsi/makefile 2019-01-16 11:00:58 +0000
@@ -0,0 +1,6 @@
#
# Single-test makefile
#
name=sih-elsi
#
include ../test.mk
=== added file 'Tests/sih-elsi/sih-elsi.fdf'
--- Tests/sih-elsi/sih-elsi.fdf 1970-01-01 00:00:00 +0000
+++ Tests/sih-elsi/sih-elsi.fdf 2019-01-16 11:00:58 +0000
@@ -0,0 +1,121 @@
# -----------------------------------------------------------------------------
# FDF for interstitial H in a cubic c-Si supercell with 64 atoms
#
# E. Artacho, April 1999
# -----------------------------------------------------------------------------

#-------
## compute-forces F
SolutionMethod ELSI
#MPI.Nprocs.SIESTA 1

WriteDMHS.NetCDF F
WriteDM.NetCDF F
WriteDM T

SystemName H in 64-atom silicon - ELSI solver
SystemLabel sih-elsi

NumberOfAtoms 65
NumberOfSpecies 2

%block ChemicalSpeciesLabel
 1 14 Si
 2 1 H
%endblock ChemicalSpeciesLabel

PAO.BasisSize DZP
PAO.EnergyShift 300 meV

%block PAO.Basis
 H 1
 0 1
 0.
 0.8
%endblock PAO.Basis

LatticeConstant 5.430 Ang
%block LatticeVectors
 2.000 0.000 0.000
 0.000 2.000 0.000
 0.000 0.000 2.000
%endblock LatticeVectors

MeshCutoff 100.0 Ry

MaxSCFIterations 20
DM.MixingWeight 0.3
DM.NumberPulay 4
DM.Tolerance 1.d-3
DM.UseSaveDM

ElectronicTemperature 25 meV

AtomicCoordinatesFormat ScaledCartesian
%block AtomicCoordinatesAndAtomicSpecies
 0.000 0.000 0.000 1 # Si 1
 0.250 0.250 0.250 1 # Si 2
 0.000 0.500 0.500 1 # Si 3
 0.250 0.750 0.750 1 # Si 4
 0.500 0.000 0.500 1 # Si 5
 0.750 0.250 0.750 1 # Si 6
 0.500 0.500 0.000 1 # Si 7
 0.750 0.750 0.250 1 # Si 8
 1.000 0.000 0.000 1 # Si 9
 1.250 0.250 0.250 1 # Si 10
 1.000 0.500 0.500 1 # Si 11
 1.250 0.750 0.750 1 # Si 12
 1.500 0.000 0.500 1 # Si 13
 1.750 0.250 0.750 1 # Si 14
 1.500 0.500 0.000 1 # Si 15
 1.750 0.750 0.250 1 # Si 16
 0.000 1.000 0.000 1 # Si 17
 0.250 1.250 0.250 1 # Si 18
 0.000 1.500 0.500 1 # Si 19
 0.250 1.750 0.750 1 # Si 20
 0.500 1.000 0.500 1 # Si 21
 0.750 1.250 0.750 1 # Si 22
 0.500 1.500 0.000 1 # Si 23
 0.750 1.750 0.250 1 # Si 24
 0.000 0.000 1.000 1 # Si 25
 0.250 0.250 1.250 1 # Si 26
 0.000 0.500 1.500 1 # Si 27
 0.250 0.750 1.750 1 # Si 28
 0.500 0.000 1.500 1 # Si 29
 0.750 0.250 1.750 1 # Si 30
 0.500 0.500 1.000 1 # Si 31
 0.750 0.750 1.250 1 # Si 32
 1.000 1.000 0.000 1 # Si 33
 1.250 1.250 0.250 1 # Si 34
 1.000 1.500 0.500 1 # Si 35
 1.250 1.750 0.750 1 # Si 36
 1.500 1.000 0.500 1 # Si 37
 1.750 1.250 0.750 1 # Si 38
 1.500 1.500 0.000 1 # Si 39
 1.750 1.750 0.250 1 # Si 40
 1.000 0.000 1.000 1 # Si 41
 1.250 0.250 1.250 1 # Si 42
 1.000 0.500 1.500 1 # Si 43
 1.250 0.750 1.750 1 # Si 44
 1.500 0.000 1.500 1 # Si 45
 1.750 0.250 1.750 1 # Si 46
 1.500 0.500 1.000 1 # Si 47
 1.750 0.750 1.250 1 # Si 48
 0.000 1.000 1.000 1 # Si 49
 0.250 1.250 1.250 1 # Si 50
 0.000 1.500 1.500 1 # Si 51
 0.250 1.750 1.750 1 # Si 52
 0.500 1.000 1.500 1 # Si 53
 0.750 1.250 1.750 1 # Si 54
 0.500 1.500 1.000 1 # Si 55
 0.750 1.750 1.250 1 # Si 56
 1.000 1.000 1.000 1 # Si 57
 1.250 1.250 1.250 1 # Si 58
 1.000 1.500 1.500 1 # Si 59
 1.250 1.750 1.750 1 # Si 60
 1.500 1.000 1.500 1 # Si 61
 1.750 1.250 1.750 1 # Si 62
 1.500 1.500 1.000 1 # Si 63
 1.750 1.750 1.250 1 # Si 64
 1.125 1.125 1.125 2 # H 65
%endblock AtomicCoordinatesAndAtomicSpecies
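For reference: this input leaves the ELSI backend unspecified, so it exercises the interface default, whereas si-quantum-dot above selects ELPA explicitly with "elsi-solver elpa". A hedged sketch of trying another backend (e.g. NTPoly) on a copy of the input; the accepted keyword values are an assumption here and should be checked against the fdf lookups in Src/m_elsi_interface.F90:

    # 'ntpoly' as an elsi-solver value is assumed, not confirmed by this diff
    cp sih-elsi.fdf sih-elsi-ntpoly.fdf
    echo "elsi-solver ntpoly" >> sih-elsi-ntpoly.fdf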
=== added file 'Tests/sih-elsi/sih-elsi.pseudos'
--- Tests/sih-elsi/sih-elsi.pseudos 1970-01-01 00:00:00 +0000
+++ Tests/sih-elsi/sih-elsi.pseudos 2019-01-16 11:00:58 +0000
@@ -0,0 +1,1 @@
Si H
=== modified file 'version.info'
--- version.info 2019-01-12 22:35:38 +0000
+++ version.info 2019-01-16 11:00:58 +0000
@@ -1,2 +1,7 @@
<<<<<<< TREE
trunk-755

=======
trunk-710--elsi-dm-695

>>>>>>> MERGE-SOURCE
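The conflict markers left in version.info above have to be resolved by hand before this branch can be merged. A sketch of the standard bzr steps, for completeness:

    # edit version.info to keep the correct revision lines, then:
    bzr resolve version.info
    bzr status   # should report no remaining text conflicts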