How to Mock Google BigQuery in Golang?

By yuseferi, 2 January, 2023

If you want to work with Google Cloud services directly from Go, the official Google Cloud Go client library (cloud.google.com/go) is the most common option.
It provides a separate client for most of the Google Cloud services.

We used this package in one of our projects, particularly the BigQuery client. The documentation is good.

Here is a sample:

import (
	"context"
	"fmt"
	"io"
	"log"
	"os"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

func main() {
	projectID := os.Getenv("GOOGLE_CLOUD_PROJECT")
	if projectID == "" {
		fmt.Println("GOOGLE_CLOUD_PROJECT environment variable must be set.")
		os.Exit(1)
	}

	// [START bigquery_simple_app_client]
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		log.Fatalf("bigquery.NewClient: %v", err)
	}
	defer client.Close()
	// [END bigquery_simple_app_client]

	rows, err := query(ctx, client)
	if err != nil {
		log.Fatal(err)
	}
	if err := printResults(os.Stdout, rows); err != nil {
		log.Fatal(err)
	}
}

// query returns a row iterator suitable for reading query results.
func query(ctx context.Context, client *bigquery.Client) (*bigquery.RowIterator, error) {

	// [START bigquery_simple_app_query]
	query := client.Query(
				CAST(id as STRING)) as url,
		FROM ` + "`bigquery-public-data.stackoverflow.posts_questions`" + `
		WHERE tags like '%google-bigquery%'
		ORDER BY view_count DESC
		LIMIT 10;`)
	return query.Read(ctx)
	// [END bigquery_simple_app_query]

// [START bigquery_simple_app_print]
type StackOverflowRow struct {
	URL       string `bigquery:"url"`
	ViewCount int64  `bigquery:"view_count"`
}

// printResults prints results from a query to the Stack Overflow public dataset.
func printResults(w io.Writer, iter *bigquery.RowIterator) error {
	for {
		var row StackOverflowRow
		err := iter.Next(&row)
		if err == iterator.Done {
			return nil
		}
		if err != nil {
			return fmt.Errorf("error iterating through results: %w", err)
		}
		fmt.Fprintf(w, "url: %s views: %d\n", row.URL, row.ViewCount)
	}
}
// [END bigquery_simple_app_print]

It sounds pretty straightforward, but there was a big problem: testing!
One suggested solution is to create a test project on Google Cloud and test your stuff there, but that's not really a good option. Another option is mocking Google BigQuery, but how? It's a bit hard, because the library provides only a client struct (without an interface) with several receiver methods that are spread across different files.

// Client may be used to perform BigQuery operations.
type Client struct {
	// Location, if set, will be used as the default location for all subsequent
	// dataset creation and job operations. A location specified directly in one of
	// those operations will override this value.
	Location string

	projectID string
	bqs       *bq.Service
}

Writing a wrapper around the BigQuery client struct takes a lot of code. I thought maybe somebody had done it before, and after searching I found an adapter project that looked nice. Unfortunately it is an archived repository and a bit outdated compared to the current Google Go client functionality (like the parameter part). So it wasn't a good option and I had to keep looking. Finally I arrived at bigquery-emulator, which helped me easily spin up a BigQuery mock service and use it for testing our BigQuery dependencies.

All you need is to create a function like this, which spins up a BigQuery mock service for you.

func MockBigQuery(projectName string, sources ...server.Source) (client *bigquery.Client, err error) {
	ctx := context.Background()
	bqServer, err := server.New(server.MemoryStorage)
	if err != nil {
		return nil, err
	}
	if err := bqServer.Load(sources...); err != nil {
		return nil, err
	}
	testServer := bqServer.TestServer()
	client, err = bigquery.NewClient(
		ctx,
		projectName,
		option.WithEndpoint(testServer.URL),
		option.WithoutAuthentication(),
	)
	if err != nil {
		return nil, err
	}
	return client, nil
}

Then there are several ways to use it. If you want to load data from a YAML file, pass a YAML loader as the source, something like:

projectName := "test"
bqClient, err := bigqueryMock.MockBigQuery(projectName, server.YAMLSource(filepath.Join("testdata", "bigquery_fixture.yaml")))

It loads the dataset(s), table name(s), and fields from the YAML file, so you don't need to create the dataset(s) and table(s) or define the schemas yourself.
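For reference, such a fixture file might look like this, following the bigquery-emulator YAML data format (the project, dataset, table, and column names here are just examples):

```yaml
projects:
- id: test
  datasets:
  - id: dataset1
    tables:
    - id: table_a
      columns:
      - name: id
        type: INTEGER
      - name: name
        type: STRING
      data:
      - id: 1
        name: alice
      - id: 2
        name: bob
```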

Another approach is to pass the source as a datasource definition, for example:

	// Note: the dataset and table names below ("dataset1", "table_a") are
	// placeholders; use your own.
	source := server.StructSource(
		types.NewProject(
			projectName,
			types.NewDataset(
				"dataset1",
				types.NewTable(
					"table_a",
					[]*types.Column{
						types.NewColumn("id", types.INTEGER),
						types.NewColumn("field_a", types.BOOL),
						types.NewColumn("field_b", types.BOOL),
						types.NewColumn("field_c", types.STRING),
						types.NewColumn("field_d", types.STRING),
						types.NewColumn("field_e", types.STRING),
					},
					types.Data{
						{
							"id":      1,
							"field_a": false,
							"field_b": false,
							"field_c": "ok",
							"field_d": "yuseferi",
							"field_e": "A",
						},
						{
							"id":      3,
							"field_a": false,
							"field_b": true,
							"field_c": "nok",
							"field_d": "golang",
							"field_e": "A",
						},
					},
				),
			),
		),
	)
	client, err := bigqueryMock.MockBigQuery(projectName, source)

Then all you need is to use this client as a mock of the BigQuery client in your unit tests and integration tests.
Feel free to share your thoughts with me.