concurrency - Are map() and reduce() appropriate for concurrent processing in Go?


Coming from a Python background and just starting with Go, I found myself looking for the equivalent of the map() and reduce() functions in Go. When I didn't find them, I fell back on for loops. For example, this is what I used instead of map(), with mapFunction defined elsewhere:

data := make([]byte, 1024)
count, err := input.Read(data)
// error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i] = mapFunction(data[i])
}

And this is what I used instead of reduce(), where there are two state variables I use to keep track of the quoting of fields in a CSV file as the code moves through each item in the slice:

data := make([]byte, 1024)
count, err := input.Read(data)
// error handling removed from this snippet
for i := 0; i < count; i++ {
    data[i], stateVariable1, stateVariable2 =
        reduceFunction(data[i], stateVariable1, stateVariable2)
}

Here are my questions:

  1. Are there builtin capabilities I missed?
  2. Is it appropriate to use mutable slices for each of these?
  3. Would it be a good idea to use goroutines for map()? Would that allow decoupling the IO operation that reads the file from the process that runs the mapping function on each item, and therefore allow parallelization?
  4. Is it correct that goroutines are not appropriate for the reduce() function, because the two state variables are defined by all of the preceding data and the processing must proceed sequentially? In other words, is this a sequential process that cannot benefit from a concurrent architecture?

Thanks!

PS - The full code is here: https://github.com/dbro/csvquote/blob/go/csvquote.go

In short:

  1. No, there is no builtin map or reduce.
  2. Yes. What else?
  3. No. Do not even think about such stuff without measuring first or having a proven real need.
  4. Yes.

A bit longer:

  1. Go is not a functional language, so there are no map/reduce builtins and none in the standard library.
  2. There are arrays and slices in Go. Both are mutable. Slices are the natural choice most of the time.
  3. Premature optimization... Also, of course: reading and processing can go in one loop, and wrapping the input in a bufio.Reader is probably a good idea.
  4. Goroutines are nice and allow a different type of program construction, but that does not mean they should be used for everything. There is no need to complicate a perfectly clear for loop by introducing goroutines.
