Thursday, June 17, 2021

NASA-APOD Wallpaper

 Things on the Internet are too cool sometimes! 

NASA APOD (Astronomy Picture of the Day) is a very popular API that provides data in JSON format, including image URLs, details, and much more. Here, I have tried to set the APOD high-quality image as the wallpaper for my desktop.

It is to be noted that this works with GNOME; I am not sure about other desktop environments. A few little tweaks might make it work there too.

How is it done?

1. The NASA-APOD URL is requested and the content of the response is obtained.

2. Python's json library is used to convert the response into a dictionary, which is easier to work with.

3. If the APOD is an image-type media, it is downloaded and saved to a directory.

4. This image is set as the wallpaper using a few OS commands.
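The steps above can be sketched in Python roughly as follows. This is a minimal sketch, not the exact code from the repository; the API key, save path, and file name are placeholders, and the wallpaper step assumes GNOME's gsettings interface.

```python
import json
import os
import subprocess
import urllib.request

# DEMO_KEY is NASA's public demo key; replace with your own API key.
APOD_URL = "https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY"

def fetch_apod(url=APOD_URL):
    """Request the APOD endpoint and parse the JSON response into a dict."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode())

def save_image(apod, directory="~/Pictures"):
    """Download the high-quality image (if the APOD is an image) and return its path."""
    if apod.get("media_type") != "image":
        return None  # some days the APOD is a video
    path = os.path.join(os.path.expanduser(directory), "apod.jpg")
    urllib.request.urlretrieve(apod.get("hdurl", apod["url"]), path)
    return path

def wallpaper_command(path):
    """Build the GNOME gsettings command that sets the desktop background."""
    return ["gsettings", "set", "org.gnome.desktop.background",
            "picture-uri", f"file://{path}"]

if __name__ == "__main__":
    apod = fetch_apod()
    path = save_image(apod)
    if path:
        subprocess.run(wallpaper_command(path))
```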

CODE CAN BE FOUND HERE

Monday, June 14, 2021

Simulating Fourier Transform

 Using Julia and Python

Discrete Fourier Transform:  

It is a method of converting a signal from the time domain to the frequency domain.
More details can be read here.
We have used the Fast Fourier Transform (FFT), as it was the primary purpose of the project. The algorithm is implemented in Python (the code can be found in the same repository). This module is called from the Julia console and used there.

When the FFT is applied in real time, each new sample is appended to the end of the set of N points (here, 64). The Fourier transform of the whole set is then computed and displayed in real time. The function below does exactly this: it cuts off the first (oldest) sample and brings the list back to the standard DFT size (here, 64 points).
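The sliding-window step can be sketched in Python like this (a sketch of the idea, not the project's exact function; NumPy's FFT stands in for the custom calculator):

```python
import numpy as np

N = 64  # DFT length used in the project

def slide(window, sample):
    """Drop the oldest sample, append the newest, and return the updated
    N-point window along with its FFT."""
    window = np.append(window[1:], sample)   # cut the first, append the last
    return window, np.fft.fft(window)        # FFT of the updated 64-point set
```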

PyCall and Plots:

As mentioned above, the FFT calculation is taken care of by a Python class, which can be found here (or in my GitHub repository mentioned at the end). The rest is handled in Julia, as it is easier and faster to generate animations in Julia than in Python. Moreover, Julia also gives us the privilege of calling Python libraries.

It is to be noted that the FFT.py class we prepared is not on the Python path and is not a built-in module, so we have to push the class/file onto the path before importing it. The way to do this is shown in the code and is almost self-explanatory.

Once we have our fft() method ready, we initialize an array of 64 zeros and, on every loop iteration, pass a new value to the calculator. This gives us the output in the form of an array of 64 complex elements. Two components can be inferred from it, and I have chosen to plot the magnitude of the spectrum, hence the absolute value, abs().

We provide a random integer between 0 and 3 to the system, and this is what we generate. It can be seen that the FFT calculator produces the equivalent Fourier transform almost instantaneously, and this can be visualized as shown.
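Put together, the random-input loop described above might look like this in Python (a sketch using NumPy's FFT in place of the project's calculator; the iteration count is arbitrary):

```python
import numpy as np

N = 64
window = np.zeros(N)                      # start from an all-zero 64-point set
rng = np.random.default_rng(0)

for _ in range(200):
    sample = rng.integers(0, 4)              # random integer between 0 and 3
    window = np.append(window[1:], sample)   # slide the window by one sample
    spectrum = np.fft.fft(window)            # 64 complex values
    magnitude = np.abs(spectrum)             # magnitude spectrum, what we plot
```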


 

Moving Average Filter:

A signal of duration 2 seconds is generated with a sampling frequency of 1 kHz. Two signals of 12f Hz and 8f Hz, where f = 10, are superposed. This signal is given a small amount of noise, and its Fourier transform is demonstrated. This set of data is fed to the FFT calculator sample by sample, and the FFT of the set is computed simultaneously. However, for the first few samples the set is zero-padded to 64 points, so it takes a while to reach a stable state. It can be seen that there are spikes at 80 Hz and 120 Hz, which clearly tell us that signals of those frequencies are present in the input, and the amplitude of each spike corresponds to the proportional contribution of that component to the input signal. Since the input keeps changing at every instant, the equivalent Fourier transform is obtained at every instant and keeps varying too.
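The test signal described above can be generated in a few lines of NumPy (the noise amplitude here is an assumed value, chosen only to illustrate "very small energy"):

```python
import numpy as np

fs = 1000                    # sampling frequency: 1 kHz
t = np.arange(0, 2, 1 / fs)  # 2-second duration -> 2000 samples
f = 10
# superpose 12f = 120 Hz and 8f = 80 Hz sinusoids
clean = np.sin(2 * np.pi * 12 * f * t) + np.sin(2 * np.pi * 8 * f * t)
noisy = clean + 0.1 * np.random.randn(t.size)  # small-energy noise (assumed level)
```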


Due to the noise in the input, it becomes very difficult to comprehend the Fourier transform. This can be improved by reducing the noise in the input signal, which can be done with a moving average filter. This filter reduces noise by averaging each sample with the preceding inputs, which smooths out the small abrupt changes caused by noise. More on moving average filters can be seen here.

In the example shown, we have chosen a 5-point MAF.
We can notice that the noise is considerably reduced, which makes the Fourier transform more stable and easier to comprehend. The catch is that the amplitude of the signal is also reduced.
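A minimal sketch of such a filter, assuming NumPy (the 5-point length matches the example above):

```python
import numpy as np

def moving_average(x, M=5):
    """M-point moving average: convolve the signal with a length-M box
    of weight 1/M, which averages each sample with its neighbours."""
    return np.convolve(x, np.ones(M) / M, mode="same")
```

Applied to the noisy signal before it is fed to the FFT calculator, this smooths the abrupt noise-driven changes at the cost of slightly attenuating the signal's amplitude.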

Github repo

Terraform - Modules

 T of Terraform:

Terraform is a tool that helps us build, change, and version the resources of an infrastructure. And not only that: Terraform can execute configuration files and request an API to bring the state of the infrastructure, or of a given entity, as close as possible to the state described in the configuration file. Terraform can serve multiple platforms, including AWS, Azure, GCP, IBM Cloud, Alibaba Cloud, and VMware. To make things more interesting, these state-driven programs can handle any platform that serves requests via API calls, provided a provider has been written for that platform.

Talking about GCP, we can use the existing, already-written provider and just write the configuration file, then request the API to achieve the state described in it. Configuration is written in HashiCorp Configuration Language (HCL), in which resources are described using blocks, arguments, and expressions.

Little into depth:

It is always better to organize things so that they are easy to comprehend, well structured, and reusable, yet not so vague that the user has to rewrite a number of parameters every time. Modules serve this purpose. A module can be created, stored, and called whenever required, much like classes and functions in OOP-based languages.

They can be organized in a general style or in any convenient way the user prefers. An example is shown, wherein the modules are stored under a single directory, categorized based on the resources they refer to or work with.

The environments, such as global, production, etc., are kept separate, and the modules are called and used whenever required, which keeps everything systematically organized and comfortable to use!
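A minimal sketch of such a layout in HCL (all names, paths, and values here are illustrative assumptions, not the exact files from the example; the gists below show the real ones):

```hcl
# modules/instance/main.tf -- a reusable module for a compute instance
variable "name" {}
variable "zone" { default = "us-central1-a" }

resource "google_compute_instance" "vm" {
  name         = var.name
  zone         = var.zone
  machine_type = "e2-medium"

  boot_disk {
    initialize_params { image = "debian-cloud/debian-11" }
  }
  network_interface { network = "default" }
}

# environments/production/main.tf -- an environment calling the module
module "web" {
  source = "../../modules/instance"
  name   = "web-1"
}
```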

Example!

Here, I have tried to create a couple of modules for resources and a piece of code that uses those modules to accomplish the task. Below is a module file for creating an instance. We can declare all the variables in the same file or make a separate file for them; Terraform reads every file in the folder with the .tf extension.

https://gist.github.com/group4pgs/1cf6a514c0ab8d78bae18bad20451b57


This file is saved in a location and its address is noted down; this is how the module for instance creation is located. We call this module as shown below. One must make sure that every variable required by the module is supplied correctly.


https://gist.github.com/group4pgs/3ca92c45ec7094bcc0e386e767c203a7

 

Concluding things

This is just one of the many useful functionalities Terraform provides. You can check the official Terraform website, which already hosts many preconfigured modules; you can refer directly to their GitHub repos and make use of the publicly available modules to build your own resources in an infrastructure.

Tuesday, June 8, 2021

From Excel to GCP

 Abstract:

There are a number of ways to automate things and make the workflow easier. For instance, to build a connected infrastructure one once had to create every resource manually and interlink everything by hand, but with the introduction of APIs, infrastructure can be defined by code, hence the name Infrastructure as Code (IaC). With the help of an enabled API, one can achieve such tasks in whatever language one is comfortable with; options include Terraform, Ansible, and many more. Here, we take resources and their specifications from the user in an Excel file (extension .xlsx) and create the resources in Google Cloud Platform (GCP) with the help of Google's API client for Python. A well-defined Excel file is created with sheets corresponding to kinds of resources such as VPCs, subnets, compute instances, firewall rules, etc. Such a method allows users to write a program and automate the working of the system in a very user-friendly manner.


 

 GCP API client for Python:

APIs are interfaces that make communication between two entities possible, be they software applications or mixed hardware-software applications. When it comes to GCP, APIs are invoked via HTTP requests following the REST (Representational State Transfer) style, which all the API client libraries make use of. In such systems, when a client requests some action, the request is sent in a representational form and a response is returned after the task is performed. The client must include the final desired state of the resource, and once the request is authenticated, the API takes care of the rest.

All the resources are primarily stored in a data store and are allotted to the respective user or organization when a request is successfully made. The details of the final state of the system are embedded in the request, and the API creates resources accordingly. For instance, to list all the instances in a particular zone of a project, a request has to be made to the API through the client, and it must include the project ID and zone. How this is done, we shall see later. For more information on API clients, you can click here.

 Compute Instance API

The Google API client has a category that handles REST API requests for instances, firewalls, VPCs, subnets, instance groups, images, health checks, and much more under a single compute service. It can be fetched via googleapiclient.discovery.build(). Once this is fetched, one can insert, list, or delete the listed resources.

For instance, if we want an instance with certain specifications, the REST request must comprise a few things, such as name, machineType, sourceImage, network, subnetwork, accessConfigs, metadata, tags, etc.

    {
        'name': name,
        'machineType': machine_type,
        'disks': [{
            'boot': True,
            'autoDelete': True,
            'initializeParams': {'sourceImage': disk_image},
        }],
        'networkInterfaces': [{
            'network': vpcLinks[network],
            'subnetwork': subnetLinks[subnet],
            'accessConfigs': [
                {'type': 'ONE_TO_ONE_NAT', 'name': 'External NAT'},
            ],
        }],
        'metadata': {
            'items': [{'key': 'startup_script',
                       'value': open(startup).read()}],
        },
        'tags': {'items': networkTags},
    }

In Python, these are specified in the form of a dictionary, and this dictionary is fed to compute.instances().insert() along with the project ID and zone.
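As a sketch, the request body can be assembled by a small helper like the one below (the function name and parameters are illustrative; the actual insert call is shown as a comment since it needs an authenticated client):

```python
def instance_body(name, machine_type, disk_image, network_link, subnet_link,
                  startup_script, network_tags):
    """Assemble the REST request body for creating a compute instance,
    mirroring the dictionary structure shown above."""
    return {
        'name': name,
        'machineType': machine_type,
        'disks': [{'boot': True, 'autoDelete': True,
                   'initializeParams': {'sourceImage': disk_image}}],
        'networkInterfaces': [{
            'network': network_link,
            'subnetwork': subnet_link,
            'accessConfigs': [{'type': 'ONE_TO_ONE_NAT',
                               'name': 'External NAT'}],
        }],
        'metadata': {'items': [{'key': 'startup_script',
                                'value': startup_script}]},
        'tags': {'items': network_tags},
    }

# With an authenticated client, the body is then sent like:
# compute.instances().insert(project=project_id, zone=zone, body=body).execute()
```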

Handling Pandas:

Now that the task of connecting to GCP is done, extracting data from the Excel sheets is handled using Pandas. It reads the Excel file (its sheets) and creates a dataframe whose keys are the column names. We can refer to a whole column as a list via its key name, like dataframe[column].

Every sheet of the file contains details about a resource that needs to be created; Pandas reads it and turns it into a dataframe. Code was developed to segregate and sort the contents into multiple dictionaries. These are then sent as requests to the API via the API client, and the resources are created.
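The per-row-to-dictionary step might look like this (the column names and values are hypothetical, and the in-memory frame stands in for a sheet read with pd.read_excel("resources.xlsx", sheet_name="Instances")):

```python
import pandas as pd

# Stand-in for one sheet of the workbook; in practice this comes from
# pd.read_excel(path, sheet_name=...). Column names are illustrative.
df = pd.DataFrame({
    "name": ["vm-1", "vm-2"],
    "zone": ["us-central1-a", "us-central1-b"],
    "machine_type": ["e2-small", "e2-medium"],
})

# Each row becomes one dictionary, ready to be shaped into an API request.
requests = df.to_dict(orient="records")
```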

Code can be found here

Scope:

It is to be noted that this was just an example demonstrating the creation of certain resources; API clients are capable of doing much more. This technique can be used to list, create, and even destroy resources. Python, being such a widely adopted language, gives us the discretion to put this power to our use and customize things as we want.