
pass error message from hadoop to user #7

@bjuergens

Description


Current behavior:

Traceback (most recent call last):
  File "/gpfs/smartdata/ugfam/dev/dirhash/dirhash.py", line 620, in <module>
    _main(sys.argv)
  File "/gpfs/smartdata/ugfam/dev/dirhash/dirhash.py", line 609, in _main
    h = hash_directory(args.dir, args.hash_algo, args.blocksize)
  File "/gpfs/smartdata/ugfam/dev/dirhash/dirhash.py", line 455, in hash_directory
    raw_hash = hash_directory_raw(dir, algo, _parse_blocksize(blocksize), sparkcontext)
  File "/gpfs/smartdata/ugfam/dev/dirhash/dirhash.py", line 327, in hash_directory_raw
    return hash_directory_raw(dir, algo, blocksize, sc)
  File "/gpfs/smartdata/ugfam/dev/dirhash/dirhash.py", line 339, in hash_directory_raw
    stderr=subprocess.PIPE
  File "/usr/lib64/python2.7/subprocess.py", line 575, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command '['hadoop', 'fs', '-ls', '-R', '/gpfs/smartdata/ugfam/dev/test_data']' returned non-zero exit status 1

When you run the internal command manually, you get a much better message:

hadoop fs -ls -R /gpfs/smartdata/ugfam/dev/test_data
ls: `/gpfs/smartdata/ugfam/dev/test_data': No such file or directory

This internal message should be passed on to the user.
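One way to surface that message is a small wrapper that captures the child's stderr and includes it in the raised error. A minimal sketch, assuming Python 3's `subprocess.run` (on the Python 2.7 shown in the traceback, `check_output(..., stderr=subprocess.STDOUT)` and reading `e.output` from the `CalledProcessError` achieves something similar; passing `stderr=subprocess.PIPE` to `check_output` alone discards the text). The demo command is a stand-in for the real `hadoop fs -ls -R` call:

```python
import subprocess
import sys

def run_checked(cmd):
    """Run cmd; on failure raise an error that includes the
    command's own stderr instead of just the exit status."""
    proc = subprocess.run(cmd, stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE, universal_newlines=True)
    if proc.returncode != 0:
        raise RuntimeError(
            "command %r failed (exit %d): %s"
            % (cmd, proc.returncode, proc.stderr.strip()))
    return proc.stdout

# Demo: a child process that writes to stderr and exits non-zero,
# standing in for the failing hadoop invocation.
try:
    run_checked([sys.executable, "-c",
                 "import sys; sys.stderr.write('ls: no such file\\n'); sys.exit(1)"])
except RuntimeError as e:
    print(e)
```

With this, the user would see Hadoop's own `No such file or directory` text instead of a bare `returned non-zero exit status 1`.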


Labels: enhancement (New feature or request)
