
Segmentation fault in distributed-memory version with MPI ranks >= 1K #40

@nfabubaker

Description


There's a segmentation fault when running the MPI version with 1K processors or more.

Config command:
./configure --with-mpi

Run command:
mpiexec -np 1024 splatt cpd enron.tns -t 1 -r 16

The input tensor can be found here.

Some trace info:

in splatt_tt_get_slices (tt=0x5810640, m=0, nunique=0x7ffe7d607c20)
in p_greedy_mat_distribution (rinfo=0x7ffe7d607dd0, tt=0x5810640, perm=0x5a96b80)
splatt/src/mpi/mpi_mat_distribute.c:464 in splatt_mpi_distribute_mats (rinfo=0x7ffe7d607dd0, tt=0x5810640, distribution=SPLATT_DECOMP_MEDIUM)
splatt/src/mpi/mpi_mat_distribute.c:616 in splatt_mpi_cpd_cmd (argc=8, argv=0x7ffe7d6085f0)
splatt/src/cmds/mpi_cmd_cpd.c:219 in main (argc=9, argv=0x7ffe7d6085e8)
