Smoke Your Server Using Goroutines

Written by mohammed-gadiwala | Published 2019/11/02

TLDR: Load testing a server can be done easily by leveraging the power of goroutines. Using the concurrent-http library we send requests at different levels of concurrency. This library does not simply allocate an equal number of requests to each worker; instead, each worker picks up the next request from a queue, so the load is divided almost equally. Sending 1,000 requests with a concurrency of 500 takes around 2.3 s, compared to the roughly 8.5 min a serial run would take. As we increase the concurrency we get diminishing returns, and pushing more requests would require more client machines and a more efficient server.

We all want to test our servers and the latency induced by scale. There are different ways to do that; one would be to use Postman to send multiple requests. But how do I send concurrent requests? Say I want to fire a million requests at different levels of concurrency. This can be easily achieved by leveraging the power of goroutines.
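The most direct way, for example, is to start one goroutine per request and wait for all of them with a sync.WaitGroup. A minimal sketch, assuming the mock server described below is listening on localhost:8080:

package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			resp, err := http.Get("http://localhost:8080/")
			if err != nil {
				fmt.Println("request failed:", err)
				return
			}
			resp.Body.Close()
		}()
	}
	wg.Wait() // block until every request has returned
}

This works for a small batch, but one goroutine per request gives no control over how many requests are in flight at once, which is exactly what we want to vary here.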
To control the level of concurrency, I will be using my simple micro-library, concurrent-http.
First, I create a mock HTTP web server to receive and count the requests. This stands in for the server you want to test. It sleeps for 500 ms to mock real processing time and then responds with HTTP status 200.
package main

import (
	"net/http"
	"sync/atomic"
	"time"
)

// count tracks how many requests have been handled. It is incremented
// atomically because the handler runs on many goroutines concurrently.
var count int64

func handle(w http.ResponseWriter, r *http.Request) {
	// Simulate real processing time.
	time.Sleep(500 * time.Millisecond)
	w.WriteHeader(http.StatusOK)
	atomic.AddInt64(&count, 1)
}

func main() {
	http.HandleFunc("/", handle)
	if err := http.ListenAndServe(":8080", nil); err != nil {
		panic(err)
	}
}
Now let’s get to the good part: sending HTTP requests through the concurrent-http library at different levels of concurrency. The library does not simply allocate an equal share of requests to each goroutine up front. Instead, each goroutine picks the next request off a shared queue, which keeps the load divided almost equally even when individual requests vary in latency. Two pieces hold the main logic together:
A WaitGroup is used to ensure all goroutines have completed and all requests have been sent.
A mutex is used to access the progress counter in a thread-safe manner.
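What follows is only a hypothetical sketch of that worker-pool pattern, not the actual source of concurrent-http; the pool type, the job channel, and the hard-coded numbers are made up for illustration.

package main

import (
	"fmt"
	"net/http"
	"sync"
)

// pool sends `total` GET requests to `url` using `concurrency` goroutines
// that all pull work from the same queue, and tracks progress behind a mutex.
type pool struct {
	mu   sync.Mutex
	done int
}

func (p *pool) run(url string, total, concurrency int) {
	// The queue: one token per request to be sent.
	jobs := make(chan struct{}, total)
	for i := 0; i < total; i++ {
		jobs <- struct{}{}
	}
	close(jobs)

	var wg sync.WaitGroup
	for i := 0; i < concurrency; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			// Each goroutine keeps picking work off the queue until it is
			// empty, so faster workers simply end up doing more requests.
			for range jobs {
				if resp, err := http.Get(url); err == nil {
					resp.Body.Close()
				}
				p.mu.Lock()
				p.done++
				p.mu.Unlock()
			}
		}()
	}
	wg.Wait() // all goroutines finished, i.e. every request has been sent
}

func main() {
	p := &pool{}
	p.run("http://localhost:8080/", 100, 10)
	fmt.Println("requests completed:", p.done)
}

With that picture in mind, here is the client code that drives concurrent-http: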
package main

import (
	"concurrent" // the concurrent-http micro-library
	"fmt"
	"net/http"
	"time"
)

func main() {
	url := "http://localhost:8080/"
	httpRequest, err := http.NewRequest("GET", url, nil)
	if err != nil {
		panic(err)
	}

	// Level of concurrency, i.e. how many requests are in flight at once.
	concurrency := 1000

	// Total number of requests to be made.
	numberOfRequests := int64(10000)
	concurrentRequest := concurrent.NewRequest(httpRequest, numberOfRequests, concurrency)

	done := make(chan struct{})
	startTime := time.Now()
	go func() {
		concurrentRequest.MakeSync()
		fmt.Printf("%v required to complete all requests\n", time.Since(startTime))
		close(done)
	}()

	// Poll the progress every 500 ms until all requests have been sent.
	tick := time.NewTicker(500 * time.Millisecond)
	defer tick.Stop()
	for {
		select {
		case <-done:
			return
		case <-tick.C:
			status := concurrentRequest.Status()
			fmt.Printf("%f%% requests sent, time elapsed: %v\n", status, time.Since(startTime))
		}
	}
}
We start sending requests, and a ticker polls the number of completed requests every 500 ms. The following results were obtained.
1. Sending 10 requests with a concurrency of 1, i.e. all requests run serially. It takes around 5 s to send 10 requests, so sending 1,000 requests serially would take roughly 8.5 min (see the back-of-the-envelope check after this list).
2. Now we use the power of concurrency and send those 1,000 requests with a concurrency of 500. The total time drops to a mere 2.3 s, compared to the ~8.5 min a serial run would have taken.
3. Finally we go high enough to actually smoke the server: 200,000 requests with a concurrency of 100,000. The throughput no longer scales linearly and the server slows down noticeably; pushing more requests than this would require more client machines and a more efficient server.
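These numbers line up with a simple bound: with a 500 ms handler and a client concurrency of C, N requests cannot finish faster than roughly ceil(N / C) x 500 ms, plus client and connection overhead. The helper below is only for illustration:

package main

import (
	"fmt"
	"time"
)

// floor returns the theoretical minimum time to push n requests through a
// server that takes 500 ms per request, at a client concurrency of c.
func floor(n, c int) time.Duration {
	batches := (n + c - 1) / c // ceil(n / c)
	return time.Duration(batches) * 500 * time.Millisecond
}

func main() {
	fmt.Println(floor(10, 1))     // 5s     -> matches the serial run above
	fmt.Println(floor(1000, 1))   // 8m20s  -> the ~8.5 min serial estimate
	fmt.Println(floor(1000, 500)) // 1s     -> observed 2.3 s once overhead is added
}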
Conclusion
Hence we see the power of goroutines and its application to smoke-testing your HTTP servers easily. We also see that as we increase the concurrency, we get diminishing returns.
