Read output from a subprocess executed by Python
Problem
I've written these 2 methods and I'm wondering if the 2nd method, which uses yield, is more efficient, or if I'm using yield wrong. Can someone refactor the 2nd method to make it more efficient for reading OS calls with large output?
def exec_cmd(command):
    proc = subprocess.Popen(command,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    stdout, stderr = proc.communicate()
    return_code = proc.returncode
    if stderr:
        raise Error(stdout, stderr)
    if return_code:
        raise Error(stdout, return_code)
    return stdout, return_code
def exec_cmd_output_line_by_line(command):
    proc = subprocess.Popen(command,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    return_code = proc.returncode
    if return_code:
        raise Error(proc.stdout, return_code)
    for line in proc.stdout.readlines():
        yield line.strip()

Solution
First of all, you have a bug. You gave stderr=subprocess.STDOUT. That means exactly what it says: stderr will be redirected to stdout. Because of that, stderr will always be None and stdout will be assigned to whatever the errors are. Test it yourself with something like ls --qaz. What you should have said is stderr=subprocess.PIPE.

It is difficult to do a comparative review because the two functions do not do quite the same thing. I will, however, do a review of the second function.
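To make that fix concrete, here is a minimal sketch of the first function with stderr captured on its own pipe. RuntimeError stands in for the question's Error class, which isn't shown:

```python
import subprocess

def exec_cmd(command):
    """Run a shell command; capture stdout and stderr separately."""
    proc = subprocess.Popen(command,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)  # PIPE, not STDOUT
    stdout, stderr = proc.communicate()
    if stderr:
        # stderr is now actually populated on error, unlike the original
        raise RuntimeError(stderr)
    if proc.returncode:
        raise RuntimeError(stdout)
    return stdout, proc.returncode
```

With stderr=subprocess.PIPE, the if stderr: branch can actually fire, which it never could in the original.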
I don't see a need to create a temporary return_code variable. Just check if proc.returncode: instead.

You don't account for the cases when there is an error. What you used in the first function looks pretty good. I would use a different exception class so that functions that call this function can handle the different types of exceptions in different ways.
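Both points can be folded into the first function along these lines. CommandError is a hypothetical name, since the question's Error class isn't shown:

```python
import subprocess

class CommandError(Exception):
    """Hypothetical exception type carrying the output and exit status."""
    def __init__(self, output, return_code):
        super().__init__(output, return_code)
        self.output = output
        self.return_code = return_code

def exec_cmd(command):
    proc = subprocess.Popen(command,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    if proc.returncode:  # no temporary return_code variable needed
        raise CommandError(stdout, proc.returncode)
    return stdout
```

Callers can then catch CommandError specifically, distinct from OSError and other failures.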
File objects are iterable. That means that your last loop can be shortened to this:
for line in proc.stdout:
    yield line.strip()

That prevents reading all of the lines into memory. You could also use map(), though it wouldn't be very efficient in Python 2 (because it creates a complete list instead of a generator):

return map(str.strip, proc.stdout)

Another option is a generator expression:
return (line.strip() for line in proc.stdout)
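Putting these suggestions together, a refactor of the second function might look like the sketch below. CommandError is again a hypothetical name, and note one caveat the original code trips over: proc.returncode stays None until the process has exited, so the check has to come after the output is consumed, not right after Popen():

```python
import subprocess

class CommandError(Exception):
    """Hypothetical stand-in for the question's unshown Error class."""

def exec_cmd_output_line_by_line(command):
    proc = subprocess.Popen(command,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    # Iterate the pipe directly: lines are yielded as they arrive,
    # without buffering the whole output in memory.
    for line in proc.stdout:
        yield line.strip()
    # returncode is None until the process exits, so the original check
    # right after Popen() could never fire; wait first, then check.
    proc.wait()
    if proc.returncode:
        raise CommandError(proc.returncode)
```

This streams large output lazily and still reports a nonzero exit status once the stream is exhausted.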
Context
StackExchange Code Review Q#128710, answer score: 3