Showing posts from 2020

Uploading artifacts too large archive - Gitlab pipeline

ERROR: Uploading artifacts as "archive" to coordinator... too large archive. Today I updated the Node version in my GitLab CI pipeline. After upgrading the version, I ran the pipeline and the job failed while uploading artifacts:

node_modules/: found 57656 matching files and directories
ERROR: Uploading artifacts as "archive" to coordinator... too large archive  id=31845 responseStatus=413 Payload Too Large status=413 token=YGCc9B7z
FATAL: too large
Cleaning up file based variables 00:00
ERROR: Job failed: command terminated with exit code 1

I spent some time debugging this error and finally fixed it. The issue was the maximum artifact size limit: in my GitLab instance the maximum artifact size was set to 100 MB, which is the default upload limit. Here are the steps to resolve this error. In your Gitlab
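Besides raising the instance-wide limit, a common complementary fix is to keep node_modules/ out of the artifact archive entirely and cache it instead. A hypothetical .gitlab-ci.yml fragment illustrating that idea (the job name, stage, and paths are placeholders, not from the original pipeline):

```yaml
# Hypothetical fragment: cache node_modules/ between jobs instead of
# shipping it as an artifact, and archive only the build output.
build:
  stage: build
  script:
    - npm ci
    - npm run build
  cache:
    paths:
      - node_modules/   # cached, not uploaded as an artifact
  artifacts:
    paths:
      - dist/           # only the build output goes to the coordinator
    expire_in: 1 week
```

With node_modules/ excluded, the archive usually stays well under the default 100 MB limit.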

Secure nginx with Let's Encrypt on Ubuntu 18.04

Configure Let's Encrypt SSL for the Nginx web server on Ubuntu 18.04. Let's Encrypt is a certificate authority (CA) that provides free SSL certificates. You can get a certificate for web servers such as Apache and Nginx. In this tutorial, I will show you how to obtain an SSL certificate using Certbot on an Ubuntu server and make your website more secure. We will use the Nginx web server in this tutorial.

Prerequisites: an Ubuntu 18.04 server with Nginx running, and a domain name. In this tutorial I will use the  &  domain. You can buy a free domain on .

Install Certbot. Certbot is an EFF tool that obtains a certificate from Let's Encrypt and automatically enables HTTPS on your website. So let's install Certbot on Ubuntu 18.04. We assume that you have installed Nginx before. Add the repository: add-apt-repository ppa:certbot/certbot This is the PPA for packages prepa
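The install-and-obtain flow described above can be sketched as the following commands; example.com is a placeholder domain, and this assumes Ubuntu 18.04 with Nginx already serving the site and DNS pointing at the server:

```shell
# Add the Certbot PPA and install the Nginx plugin (placeholder domain).
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get update
sudo apt-get install python-certbot-nginx

# Obtain a certificate and let Certbot configure Nginx for HTTPS:
sudo certbot --nginx -d example.com -d www.example.com

# Confirm that automatic renewal will work:
sudo certbot renew --dry-run
```

The `--nginx` plugin edits the matching server block for you; if you prefer to adjust Nginx yourself, `certbot certonly --nginx` fetches the certificate without touching the config.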

Manage GCP bucket using gsutil command

How to create a GCP storage bucket using the CLI. Image credit: Wikipedia & Google Cloud. In this tutorial, I will explain how to manage and perform basic tasks in GCP Cloud Storage using the command line. Cloud Storage is worldwide, highly durable object-based storage. You can access data instantly at any time from any storage class. To manage Cloud Storage from the command line we need to configure gsutil.

Install gsutil on Ubuntu 18.04. Using gsutil you can manage GCP Cloud Storage and perform storage-related tasks from the command line. So let's install gsutil on Ubuntu. There are different methods to install gsutil, but here I will install it using pip, the installer for the Python Package Index.

Install the required packages: sudo apt-get install gcc python-dev python-setuptools libffi-dev
Install the pip installer: sudo apt-get install python-pip
Install gsutil using pip: sudo pip install gsutil

Now you are ready to use gsutil. OR
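Once gsutil is installed and configured, the basic bucket tasks the post refers to look like this; my-example-bucket and backup.tar.gz are placeholders:

```shell
# Basic Cloud Storage operations with gsutil (placeholder names).
gsutil mb gs://my-example-bucket/                  # make a new bucket
gsutil ls                                          # list buckets in the project
gsutil cp backup.tar.gz gs://my-example-bucket/    # upload an object
gsutil ls gs://my-example-bucket/                  # list objects in the bucket
gsutil rm gs://my-example-bucket/backup.tar.gz     # delete an object
gsutil rb gs://my-example-bucket/                  # remove the (now empty) bucket
```

Bucket names are globally unique across all of GCP, so `mb` will fail if the name is already taken.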

Nginx Ingress with Cert-Manager Kubernetes

We will use Helm to install cert-manager on our cluster. Cert-manager is a Kubernetes-native certificate manager. One of its most significant features is the ability to automatically provision TLS certificates: based on the annotations on a Kubernetes Ingress resource, cert-manager will talk to Let's Encrypt and acquire a certificate on your service's behalf.

Note: ensure that you are using Helm v2.12.1 or later.

Prerequisites: a Kubernetes cluster version 1.8+, the kubectl CLI installed and configured, and Helm and Tiller installed.

1. Connect to the cluster: gcloud container clusters get-credentials yourclustername --zone zonename --project projectname
2. Create a cert-manager namespace. Before installing cert-manager, we will create a namespace for it: kubectl create namespace cert-manager
3. Install cert-manager. Now, install cert-manager and its CRDs. It will install the issuer and cluster
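The numbered steps above can be sketched as the commands below, following the Helm v2 flow the post assumes (Tiller already installed). The cert-manager version and CRD manifest URL are illustrative, not taken from the original:

```shell
# Point kubectl at the GKE cluster (placeholder names):
gcloud container clusters get-credentials yourclustername --zone zonename --project projectname

# Create the namespace cert-manager will live in:
kubectl create namespace cert-manager

# Install the CRDs first (version is illustrative), then the chart:
kubectl apply -f https://github.com/jetstack/cert-manager/releases/download/v0.11.0/cert-manager.crds.yaml
helm repo add jetstack https://charts.jetstack.io
helm repo update
helm install --name cert-manager --namespace cert-manager jetstack/cert-manager
```

After the pods in the cert-manager namespace are ready, you can create an Issuer or ClusterIssuer resource pointing at Let's Encrypt.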

Publish your Docker Image to Docker Hub

Create and publish your Docker image on Docker Hub. In this note, I will show you how to upload your custom image to Docker Hub and make it public, so other people can also use that image in their applications. Docker Hub is the world's easiest way to create, manage, and deliver your teams' container applications. You can find many images on Docker Hub for your application, and you can also upload your own custom images.

Log in to Docker Hub. First you need to log in to Docker Hub. Run the command below from your terminal: docker login It will ask you to enter your username and password. Login with your Docker ID to push and pull images from Docker Hub. If you don't have a Docker ID, head over to  to create one.

Username: imvishalvyas
Password: ************
Login Succeeded

We have now successfully authenticated to Docker Hub. So now you ca
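After a successful login, the publish flow typically looks like this; myapp and the 1.0 tag are placeholder names (imvishalvyas is the Docker ID shown in the post):

```shell
# Authenticate, then build, tag, and push a custom image (placeholder names).
docker login
docker build -t myapp .
docker tag myapp imvishalvyas/myapp:1.0
docker push imvishalvyas/myapp:1.0
```

The repository part of the tag must match your Docker ID (or an organization you belong to), otherwise the push is rejected.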