Thanks for reporting the problem and fixing the bug.
I have to say that the two options eph_task = -2 and +2 are not very well documented.
Let me try to clarify the main purpose of these two options.
eph_task = +2 is mainly used to produce a netcdf file with the matrix elements <k+q,b'|d/d_p V_scf|k, b>
associated with a single atomic perturbation p, where p is a shorthand notation for (qpt, idir, ipert).
Although the file produced by this option has the extension GKK.nc, this file **cannot** be used by the e-ph part of anaddb.
The main purpose of this option is to generate netcdf files for
the [ElectronPhononCoupling](
https://github.com/GkAntonius/ElectronPhononCoupling) python package
that computes the e-ph self-energy and QP corrections.
Some of these features are now implemented in Abinit by eph_task = +4
(see e.g. the [eph4zpr](
https://docs.abinit.org/tutorial/eph4zpr/) tutorial).
eph_task = -2, on the other hand, produces a GKQ.nc file with the same e-ph matrix elements in the atomic representation but in a slightly different format.
The main purpose of this task is to generate a netcdf file with e-ph matrix elements with q-points
along a high-symmetry path
so that one can then use abipy to visualize |g(k, q, nu, b, b')| as a function of q for fixed k.
We usually perform N_qpath calculations, in each of which we pass a different WFQ file computed on a shifted k+q mesh.
Note that one has to average the e-ph matrix elements over degenerate phonon modes and degenerate |k+q, b'>, |k, b> bands
(if any) in order to obtain a gauge-independent quantity.
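This averaging over degenerate subspaces can be sketched in a few lines of numpy. The function name and the simple energy-tolerance grouping below are my own illustration, not part of AbiPy or Abinit; the input is assumed sorted by energy:

```python
import numpy as np

def average_over_degenerates(g2, energies, tol=1e-8):
    """Average |g|^2 over groups of degenerate states.

    g2       : (nstate,) array of |g|^2 for fixed values of the other indices
    energies : (nstate,) sorted eigenvalues used to detect degeneracies
    Returns a new array in which each degenerate group carries its group mean,
    which is the gauge-independent combination.
    """
    g2 = np.asarray(g2, dtype=float)
    energies = np.asarray(energies, dtype=float)
    out = g2.copy()
    start = 0
    for i in range(1, len(energies) + 1):
        # Close a degenerate group at the end of the array or when the
        # energy jumps by more than the tolerance.
        if i == len(energies) or abs(energies[i] - energies[start]) > tol:
            out[start:i] = g2[start:i].mean()
            start = i
    return out

# Two degenerate bands (same energy) share the average of their |g|^2.
g2 = np.array([1.0, 3.0, 5.0])
eig = np.array([0.5, 0.5, 1.2])
print(average_over_degenerates(g2, eig))  # -> [2. 2. 5.]
```

The same averaging is applied along the phonon-mode axis using the phonon frequencies to detect degenerate modes.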
We don't use this option very often because, in our recent studies, we found that it is much easier to look at the
unit-cell averaged, lattice-periodic part of the scattering potential, or at its Fourier components with G /= 0.
See e.g. Eq 4 and Fig 2 of
https://journals.aps.org/prl/pdf/10.110 ... 125.136601 and the SI.
The scattering potential is indeed gauge-independent, and important features such as the Fröhlich divergence or the jump discontinuity for q --> 0 in semiconductors due to dynamical quadrupoles are clearly visible at the level of the scattering potential,
without having to plot |g(q)|^2 directly.
This task corresponds to eph_task = +15, and we usually use abipy tools to plot the results.
To summarize, eph_task = +2 is mainly used to interface Abinit with the ElectronPhononCoupling package for ZPR calculations,
while eph_task = -2 is used for visualizing the q-dependence of the e-ph matrix elements.
eph_task = +15 (analysis of the scattering potentials in q-space) is much easier to use since there's no gauge freedom in the potentials,
but this kind of visualization does not give any insight into the contribution to the e-ph matrix elements
coming from the Bloch orbitals. For instance, particular e-ph matrix elements may vanish
due to the symmetry of the scattering potential and of the |k+q>, |k> states, but this effect won't show up
if you plot the scattering potential alone.
I looked at the abipy workflow since I can't find it in the documentation elsewhere.
It looks like doing an nscf calculation at the relevant q-point is the way to do it.
It seems unnecessary since the respfn code is able to tell if the q-point is commensurate with the k-point grid.
This step is unnecessary if your goal is to have <k',b'|\Delta_{q\nu} V|k, b> for all k, k' belonging to the ngkpt k-mesh associated with the input WFK. It is needed if you want to compute <k+q,b'|\Delta_{q\nu} V|k, b> when
q belongs to a high-symmetry path.
The AbiPy workflow was designed for the latter case, i.e., for visualization purposes.
I want the gkq elements in the full BZ,
Why do you need the gkq elements in the full BZ?
For visualization purposes or for computing integrals in the BZ?
The EPH code is in principle able to compute g(k, q) for all k and k' = k+q in the full BZ once you provide an input WFK file
and set eph_ngqpt_fine == ngkpt, but there's no eph_task that is explicitly designed with this kind of operation in mind.
When we compute g(k, q) for all the k- and q-points in the "BZ", we assume the user wants to obtain physical observables
such as QP energies, phonon linewidths, etc., but, for performance reasons, we never precompute and store all the e-ph matrix elements
in memory.
The BZ integration in k/q space of the |g|^2 terms is always done on the fly, possibly by multiple MPI processes.
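A toy illustration of this on-the-fly accumulation, with a hypothetical `g2(ik, iq)` standing in for the actual computation of |g(k, q)|^2 (the real code of course involves wavefunctions and DFPT potentials, and distributes the k-loop over MPI processes):

```python
import numpy as np

nk, nq = 1000, 50

def g2(ik, iq):
    # Hypothetical stand-in for |g(k, q)|^2 computed on the fly.
    return (np.sin(ik * 0.01) * np.cos(iq * 0.1)) ** 2

# Accumulate the BZ average (1/nk) * sum_k |g(k, q)|^2 for each q-point,
# never materializing the full (nk, nq) array of matrix elements.
# In the parallel version each MPI process handles its own slice of
# k-points and the partial sums are combined with an all-reduce at the end.
gamma = np.zeros(nq)
for ik in range(nk):
    for iq in range(nq):
        gamma[iq] += g2(ik, iq) / nk
```

The peak memory is O(nq) instead of O(nk * nq), which is what makes dense k-meshes affordable.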
Only the final results are presented to the user.
On the one hand, I understand that having the full g(k, q, ...) array stored on disk can be useful
if you want to produce cool 2D or 3D plots of |g(k, q)|^2 for fixed k/q, or if you need to read the e-ph matrix elements to implement some kind of post-processing step in a high-level language.
On the other hand, writing all this stuff in parallel is not trivial and that's the reason why I would like
to have a better understanding of your problem.
which Abinit apparently already uses to calculate phonon linewidths on an arbitrary q-path.
I just don't seem to be able to access the gkq elements.
Actually the phonon linewidths are interpolated using a Fourier-interpolation scheme similar in spirit to the one used for the dynamical matrix.
In a nutshell, the code computes the phonon linewidths gamma(q) in the IBZ (actually a tensor from which one
can obtain the ph-linewidths but this is not relevant for the discussion).
Then we "rotate" the gamma(q) terms to get the values in the BZ. At this point we can do a q --> R Fourier transform to get gamma(R) in real space and then use the inverse transform to *Fourier interpolate* the signal at arbitrary q-points.
Note that e-ph matrix elements are not needed at this level.
You can interpolate the scattering potentials to obtain gamma(q) on a q-grid that is much denser than the one used at the DFPT level, but the q-points still belong to a regular q-mesh, so you need an extra level of interpolation
to go from the q-mesh to an *arbitrary* q-path.
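In one dimension, the whole scheme fits in a few lines of numpy. This is only a cartoon of the idea, not the actual anaddb/EPH implementation: gamma(q) sampled on a regular mesh is transformed to gamma(R), and the inverse transform can then be evaluated at any q, not just the mesh points. The interpolation is exact as long as gamma(R) decays within the mesh, i.e. the signal is band-limited:

```python
import numpy as np

# gamma(q) sampled on a regular q-mesh; all frequencies are below the
# Nyquist limit, so the coefficients gamma(R) are exact.
nq = 8
q_mesh = np.arange(nq) / nq
gamma_q = 1.0 + 0.5 * np.cos(2 * np.pi * q_mesh) + 0.2 * np.sin(4 * np.pi * q_mesh)

# q --> R: forward transform gives the "real-space" coefficients gamma(R).
gamma_R = np.fft.fft(gamma_q) / nq

def interpolate(q):
    """Evaluate the inverse transform at an arbitrary q (Fourier interpolation)."""
    R = np.arange(nq)
    # Fold the upper half of the spectrum to negative R so that the
    # exponentials use the smallest equivalent lattice vectors.
    R = np.where(R > nq // 2, R - nq, R)
    return np.real(np.sum(gamma_R * np.exp(2j * np.pi * q * R)))
```

Evaluating `interpolate` on the original mesh points reproduces `gamma_q`, and evaluating it at any q off the mesh gives the smoothly interpolated linewidth, which is exactly what is needed to follow a high-symmetry q-path.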
I hope this addresses some of the questions raised in your previous posts.
Unfortunately, you are exploring parts of the EPH code that have been mainly used for our internal studies.
Some of these features are not fully documented and the API as well as the netcdf file formats may change in the near future.
We plan to write a tutorial explaining how to perform this kind of analysis but before doing that we need to polish a bit
our python scripts so that they can be easily used by other users.