concurrency - Are map() and reduce() appropriate for concurrent processing in Go?
Coming from a Python background and just starting with Go, I found myself looking for the equivalent of the map() and reduce() functions. When I didn't find them, I fell back on for loops. For example, this is what I used instead of map(), with mapFunction defined elsewhere:
```go
data := make([]byte, 1024)
count, err := input.Read(data)
// error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i] = mapFunction(data[i])
}
```
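For context, here is a minimal runnable version of that loop. The real mapFunction is defined elsewhere in the full code; the one below is a hypothetical stand-in that upper-cases ASCII letters:

```go
package main

import (
	"fmt"
	"strings"
)

// mapFunction is a hypothetical stand-in for the real per-byte
// transformation: it upper-cases a single ASCII letter.
func mapFunction(b byte) byte {
	if b >= 'a' && b <= 'z' {
		return b - ('a' - 'A')
	}
	return b
}

func main() {
	input := strings.NewReader("hello, world")
	data := make([]byte, 1024)
	count, err := input.Read(data)
	if err != nil {
		panic(err)
	}
	// The loop-based equivalent of Python's map().
	for i := 0; i < count; i++ {
		data[i] = mapFunction(data[i])
	}
	fmt.Println(string(data[:count])) // HELLO, WORLD
}
```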
And this is what I used instead of reduce(). There are two state variables that I'm using to keep track of the quoting of fields in a CSV as the code moves through each item in the slice:
```go
data := make([]byte, 1024)
count, err := input.Read(data)
// error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i], stateVariable1, stateVariable2 = reduceFunction(data[i], stateVariable1, stateVariable2)
}
```
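Again for context, a runnable sketch of that stateful loop. The real reduceFunction handles CSV quoting; the stand-in below is a simplified assumption that toggles an in-quotes flag on `"` and replaces commas inside quotes with a tab, just to show the two state variables being threaded through each step:

```go
package main

import (
	"fmt"
	"strings"
)

// reduceFunction is a simplified stand-in for the real CSV-quoting logic:
// it toggles an in-quotes flag on '"', counts quote characters, and
// replaces commas that appear inside quotes with a tab.
func reduceFunction(b byte, inQuotes bool, quoteCount int) (byte, bool, int) {
	switch {
	case b == '"':
		return b, !inQuotes, quoteCount + 1
	case b == ',' && inQuotes:
		return '\t', inQuotes, quoteCount
	default:
		return b, inQuotes, quoteCount
	}
}

func main() {
	input := strings.NewReader(`a,"b,c",d`)
	data := make([]byte, 1024)
	count, err := input.Read(data)
	if err != nil {
		panic(err)
	}
	inQuotes := false
	quoteCount := 0
	// The loop-based equivalent of a stateful reduce(): each step depends
	// on state produced by all preceding bytes, so it must run in order.
	for i := 0; i < count; i++ {
		data[i], inQuotes, quoteCount = reduceFunction(data[i], inQuotes, quoteCount)
	}
	fmt.Println(string(data[:count]), quoteCount)
}
```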
Here are my questions:
- Are there built-in capabilities that I missed?
- Is it appropriate to use mutable slices for each of these?
- Would it be a good idea to use goroutines for map()? Would that allow decoupling the IO operation of reading the file from the process of running the mapping function on each item, and therefore allow parallelization?
- Is it correct that goroutines are not appropriate for the reduce() function, because the two state variables depend on all of the preceding data, so the reduction must proceed sequentially? In other words, is this a sequential process that cannot benefit from a concurrent architecture?
thanks!
PS - Full code here: https://github.com/dbro/csvquote/blob/go/csvquote.go
in short:
- No, there are no built-in map or reduce functions.
- Yes. What else would you use?
- No. Don't even think about such stuff without measuring first or a proven, real need.
- Yes.
A bit longer:
- Go is not a functional language, so there are no map/reduce builtins and none in the standard library.
- There are arrays and slices in Go. Both are mutable. Slices are the natural choice most of the time.
- That would be premature optimization... And of course: combining the reading and the processing into one loop and wrapping the input in a bufio.Reader would be a good idea.
- Goroutines are nice and allow a different type of program construction, but that does not mean they should be used for everything. There is no need to complicate a perfectly clear loop by introducing goroutines.
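To illustrate the third point, here is a minimal sketch of combining reading and processing into one loop with a bufio.Reader. The per-byte mapFunction is a hypothetical stand-in, not the real one from the question:

```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"strings"
)

// mapFunction is a hypothetical stand-in for the per-byte transformation.
func mapFunction(b byte) byte {
	if b >= 'a' && b <= 'z' {
		return b - ('a' - 'A')
	}
	return b
}

func main() {
	in := bufio.NewReader(strings.NewReader("csv,quote"))
	var out strings.Builder
	// One loop does both the reading and the processing; bufio.Reader
	// handles buffering, so no goroutine is needed to decouple IO.
	for {
		b, err := in.ReadByte()
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
		out.WriteByte(mapFunction(b))
	}
	fmt.Println(out.String()) // CSV,QUOTE
}
```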