Afraid I'm not nearly competent enough yet to try to contribute, but I'd love to someday once I learn a lot more Go. In the meantime, perhaps this can serve as a sort of feature request people can vote on. I'm sure I'm probably not the first person to think of or ask for this, so apologies if I'm just making noise here.

Here is a reproducible example of where I'm coming from:
```go
package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"net/http"
	"regexp"
	"strings"
	"sync"

	"github.com/go-gota/gota/dataframe"
)

// Create a wait group
var wg sync.WaitGroup

// Get api response (expects format=csv) and make a dataframe from it
func getDf(url string, c chan dataframe.DataFrame) {
	// Tell the wait group we are done when this goroutine returns
	defer wg.Done()
	// Pull chart name from the url
	re := regexp.MustCompile("chart=(.*?)&")
	match := re.FindStringSubmatch(url)
	if match == nil {
		log.Fatalf("no chart parameter found in url %s", url)
	}
	chart := match[1]
	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	// Get body as string for ReadCSV
	bodyBytes, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	df := dataframe.ReadCSV(strings.NewReader(string(bodyBytes)))
	// Add chart suffix to each col name
	// (ignore first col which should be "time" and is used for joins later)
	for i, colname := range df.Names() {
		if i != 0 {
			df = df.Rename(chart+"|"+colname, colname)
		}
	}
	// Send df to the channel
	c <- df
}

func main() {
	// Define a list of api calls we want data from
	// In this example we have an api call for each chart's data we want in our df
	urls := []string{
		"https://london.my-netdata.io/api/v1/data?chart=system.cpu&format=csv&after=-10",
		"https://london.my-netdata.io/api/v1/data?chart=system.net&format=csv&after=-10",
		"https://london.my-netdata.io/api/v1/data?chart=system.load&format=csv&after=-10",
		"https://london.my-netdata.io/api/v1/data?chart=system.io&format=csv&after=-10",
	}
	// Create a channel of dataframes sized to the number of api calls we need to make
	dfChannel := make(chan dataframe.DataFrame, len(urls))
	// Create an empty df we will outer join into from the df channel later
	df := dataframe.ReadJSON(strings.NewReader(`[{"time":"1900-01-01 00:00:01"}]`))
	// Kick off a goroutine for each url
	for _, url := range urls {
		wg.Add(1)
		go getDf(url, dfChannel)
	}
	// Wait for every goroutine, then close the channel so the range below terminates
	wg.Wait()
	close(dfChannel)
	// Pull each df from the channel and outer join onto our original empty df
	for dfTmp := range dfChannel {
		df = df.OuterJoin(dfTmp, "time")
	}
	// Sort based on time
	df = df.Arrange(dataframe.Sort("time"))
	// Print df
	fmt.Println(df)
	// Describe df
	//fmt.Println(df.Describe())
}
```
The df I end up with above is an outer join of lots of dataframes which may have different frequencies in time, e.g. some have data every 5 seconds and some every 1 second. So I'd love to just ffill() all the NaN values to the last known value.

I'm guessing maybe this could be done with a custom function using Capply?

I'm new to Go and Gota, and I'm wondering if there is anything implemented yet that I could leverage to forward fill my dataframe, similar to pandas ffill: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.ffill.html