Why is my ergm always stuck? #275
Replies: 6 comments
-
Stuck at
can't get the result.
-
We've found that the
-
Thanks for your reply. PSOCK works fine. Did the old version of Open MPI support ergm in parallel? I am computing with very large data, so I may need parallel estimation on an MPI cluster.
-
I'm not sure what you are asking, but I use PSOCK for parallel cluster estimation with ergm.
-
I mean, it seems that PSOCK can only work on one machine. I hope it can run on a cluster of multiple machines.
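For what it's worth, a PSOCK cluster is not limited to one machine: parallel::makePSOCKcluster() accepts a vector of hostnames and starts workers over ssh. A minimal sketch (the hostnames, worker counts, and model formula are placeholders; it assumes passwordless ssh and the same R/ergm installation on every node):

```r
library(parallel)
library(ergm)

# Start workers over ssh on two machines (hypothetical hostnames);
# passwordless ssh and identical R library paths are assumed.
cl <- makePSOCKcluster(c(rep("slave05", 8), rep("slave06", 8)))

# Toy model on a sample network shipped with ergm, just to show
# where the cluster object is passed.
data(florentine)
fit <- ergm(flomarriage ~ edges + triangle,
            control = control.ergm(parallel = cl))

stopCluster(cl)
```

control.ergm() accepts either a number of workers or a preexisting cluster object, so any cluster you can build with the parallel package should work here.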
-
I am going to convert this to a discussion, since this seems to be more of a Q&A.
-
Starting maximum pseudolikelihood estimation (MPLE):
Evaluating the predictor and response matrix.
Maximizing the pseudolikelihood.
Finished MPLE.
2 slaves are spawned successfully. 0 failed.
Starting Monte Carlo maximum likelihood estimation (MCMLE):
Iteration 1 of at most 20:
Optimizing with step length 0.909764142612762.
The log-likelihood improved by 4.088.
Iteration 2 of at most 20:
Optimizing with step length 1.
The log-likelihood improved by 0.6494.
Step length converged once. Increasing MCMC sample size.
Iteration 3 of at most 20:
Optimizing with step length 1.
The log-likelihood improved by 0.07634.
Step length converged twice. Stopping.
Finished MCMLE.
Evaluating log-likelihood at the estimate. Using 20 bridges: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 .
This model was fit using MCMC. To examine model diagnostics and check
for degeneracy, use the mcmc.diagnostics() function.
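The "2 slaves are spawned successfully. 0 failed." line above is printed by Rmpi when ergm spawns MPI workers. A run that produces it would look roughly like this (a sketch; the worker count and formula are placeholders, and it assumes Rmpi is installed and built against your Open MPI):

```r
library(ergm)

# parallel.type = "MPI" makes ergm spawn Rmpi slave processes;
# Rmpi then reports how many slaves were spawned.
data(florentine)
fit <- ergm(flomarriage ~ edges + triangle,
            control = control.ergm(parallel = 2, parallel.type = "MPI"))
```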
Open MPI itself seems to work normally:
[admin@slave06 examples]$ mpirun -np 32 hello_c
Hello, world, I am 0 of 32, (Open MPI v4.0.5, package: Open MPI admin@slave06 Distribution, ident: 4.0.5, repo rev: v4.0.5, Aug 26, 2020, 108)
[... ranks 1-23 print the same line from slave06 ...]
Hello, world, I am 24 of 32, (Open MPI v4.0.5, package: Open MPI admin@slave05 Distribution, ident: 4.0.5, repo rev: v4.0.5, Aug 26, 2020, 108)
[... ranks 25-31 print the same line from slave05 ...]
[admin@slave06 examples]$ mpirun -np 32 connectivity_c
Connectivity test on 32 processes PASSED.
Why can't ergm execute with MPI?
env:
ucx-1.9.0
hwloc-2.3.0
libevent-2.1.12-stable
openmpi-4.0.5
pmix-3.2.3
Configure flags: --prefix=/data1/openmpi --with-pmix=/data1/openmpi/pmix --with-libevent=/data1/openmpi/libevent --with-hwloc=/data1/openmpi/hwloc --with-ucx=/data1/openmpi/ucx
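Since mpirun works, one way to narrow this down might be to test Rmpi on its own, outside ergm (a sketch; nslaves is a placeholder, and it assumes Rmpi was built against the same Open MPI 4.0.5 install):

```r
library(Rmpi)

# Spawn two R slaves; if this hangs, the problem is in
# Rmpi/Open MPI, not in ergm itself.
mpi.spawn.Rslaves(nslaves = 2)

# Each slave reports its rank and host.
mpi.remote.exec(paste("Rank", mpi.comm.rank(),
                      "on", Sys.info()[["nodename"]]))

mpi.close.Rslaves()
mpi.quit()
```

If this also hangs at the spawn step, the hang is below ergm, in the Rmpi/Open MPI layer.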