
Posts

Operations of Multipart Upload in S3

MULTIPART UPLOAD – INTRODUCTION Multipart upload is one of the most important operations in the S3 protocol: it lets a client upload the parts of a file separately and have S3 combine those parts into a single object. Many S3-compatible clients, such as S3Browser, fall back to it whenever a large file needs to be uploaded, typically anything larger than 10MB. In such situations the software splits the file into many parts and uploads them according to the configured number of simultaneous uploads. So far we have only talked about uploading parts, but why not upload the file in one go instead of getting into the more complex multipart upload operations in S3? MULTIPART UPLOAD – LOGIC The logic is very simple: reduce the risk of the upload process. A long-running upload is risky and prone to timeout issues. If you upload a file such as an ISO of around 4GB, it is quite possible that a timeo...
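For context, the same part-by-part flow can be driven by hand with the aws CLI s3api calls. The sketch below is only an illustration: the bucket name, key, part files, upload id and ETag values are placeholders you would substitute from the actual responses.

$ # 1. start the upload and note the UploadId returned in the response
$ aws s3api create-multipart-upload --bucket mybucket --key big.iso
$ # 2. upload each part, recording the ETag returned for it
$ aws s3api upload-part --bucket mybucket --key big.iso --part-number 1 --body big.iso.part1 --upload-id <UploadId>
$ aws s3api upload-part --bucket mybucket --key big.iso --part-number 2 --body big.iso.part2 --upload-id <UploadId>
$ # 3. ask S3 to combine the parts into a single object
$ aws s3api complete-multipart-upload --bucket mybucket --key big.iso --upload-id <UploadId> --multipart-upload '{"Parts":[{"PartNumber":1,"ETag":"<etag1>"},{"PartNumber":2,"ETag":"<etag2>"}]}'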

Cursor Highlighting in Vim

Most of the time I use the vim editor when I need to edit and modify files. In the beginning it looks strange to people coming from Windows, since the classic Notepad application is much simpler and its graphical user interface feels easier. Having several modes makes vim a little difficult to understand at first, but the strength of the editor lies behind these modes. Of course, we will get to cursor highlighting in vim a little later. What we usually learn first in vim is to open a file, press i to enter INSERT mode, make some changes, press ESC to return to escape mode, and type :wq to save and exit or :q! to quit without saving. As we all know, this is the most basic thing we can do with vim. Within ESCAPE mode there are many more things that can be done, such as replacing strings, adding line numbers, defining the number of spaces in an indent ...
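As a small preview of the highlighting topic, the options usually involved are cursorline and cursorcolumn. A minimal sketch of the lines you could add to ~/.vimrc (or type in escape mode with a leading :) follows; the colour value is just an example.

set cursorline                    " highlight the line under the cursor
set cursorcolumn                  " highlight the column under the cursor
highlight CursorLine cterm=NONE ctermbg=DarkGrey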

Passing Variables to Awk

Introduction to Passing Variables to AWK AWK is one of the most popular commands for working with string-based files that can be parsed into fields using delimiters, and its functionality can be extended even further by passing variables to awk. AWK is convenient to work with because of its simple, readable scripting for tasks such as filtering fields. The special parameter -v is used to pass bash variables, or variables inside a script, into awk. The general syntax looks like the following; the important point is that each variable passed into the awk script needs its own -v flag. awk -v var_name=value [-v var_name2=value2] 'AWK script' input_file In a sample input.txt file, say the content looks like the following. If delimited by spaces, the first field is the count, the second field is the user agent, and the third field is the agent version. $ cat input.txt 30 okhttp 5.0...
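As a hedged example of what this enables (the threshold value here is just an assumption for illustration), a shell variable can be handed to awk with -v and used inside the filter:

$ threshold=20
$ # print the user agent and version for rows whose count exceeds the threshold
$ awk -v t="$threshold" '$1 > t {print $2, $3}' input.txt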

Using Aws Cli for S3 Operations

This article will be about some basic usages of awscli for S3 operations. First we need to create a profile; this basically stores the information about our S3 connection. $ aws configure AWS Access Key ID []: <this will be the Access Key Id> AWS Secret Access Key []: <this will be the Secret Key> Default region name []: <this will be the region name> Default output format []: <this will be the output format of each aws command> Listing the buckets $ aws s3 ls --endpoint-url <endpoint url> 2023-06-07 16:37:25 bucket1 2023-06-14 13:50:00 bucket2 List objects and prefixes in the bucket $ aws s3 ls --endpoint-url <endpoint url> bucket1 PRE folder1 2023-06-15 10:32:40 101713608 object1 Get the ObjectLock configuration for a bucket $ aws s3api get-object-lock-configuration --bucket bucket1 { "ObjectLockConfiguration": { "ObjectLockEnabled": "Enabled", "Rule": { "DefaultRetention": { ...
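Beyond listing, the everyday object operations follow the same pattern. A short, hedged example with placeholder bucket and file names for uploading, downloading and deleting an object:

$ aws s3 cp ./object1 s3://bucket1/folder1/object1 --endpoint-url <endpoint url>
$ aws s3 cp s3://bucket1/folder1/object1 ./object1.copy --endpoint-url <endpoint url>
$ aws s3 rm s3://bucket1/folder1/object1 --endpoint-url <endpoint url>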

How do I perform Patch Installation on My Powerscale Cluster?

I have been using the Isilon simulator for a long time to test features and prepare myself for projects. Since this is a home environment, I also perform patch installations when I have time. Let's first power on the PowerScale nodes in my VMware Workstation. We will need to wait some time for all nodes to be fully up and running. Let's check whether we are able to log in from the CLI and run "isi status" to check the cluster status. Next, we need to download the Rollup Patch from the Dell Technologies Support page. After downloading, we need to extract the file. You will see that the contents of the file look similar to the below; here we need to upload the file with the pkg extension to /ifs/data/Isilon_Support/. I am using FileZilla to perform the upload, and I can verify that the file is there. Now we are ready to perform the code upgrade. This part is really important, since here we are only performing the code upgrade; in a real-world scenario, you will nee...
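For reference, here is a hedged sketch of the CLI side of that flow, assuming a OneFS release where patches are managed through isi upgrade patches (the patch file name below is a placeholder):

$ # check cluster health before touching anything
$ isi status
$ # install the uploaded patch package (path and file name are examples)
$ isi upgrade patches install /ifs/data/Isilon_Support/<patch-name>.pkg
$ # confirm the patch shows up afterwards
$ isi upgrade patches list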

Virtual Machine Provisioning Automation at Home Trials and Fails - 1

For a long time I have been using a home lab environment for many things. The way I do it is to clone my template virtual machines to provision new ones and then change their settings manually. This is not rocket science; however, I wanted to ease and automate one of my basic processes. Using VMware Workstation Pro 16 in my environment, I noticed that a REST API is provided with the software. There is no option for creating a virtual machine from scratch, but there is an option to clone one VM to another, which surprised me but which I liked a lot. While digging further, I noticed that I could use Terraform to do such provisioning with the VMware Workstation provider, which I was thinking of using. I was super excited to use Terraform for the first time in my life, so I cloned a new Virtual Machine, set its IP Address, created the DNS records as I described in my blog article, etc. I checked the Terraform deployment article, prepared the prerequisites and i...
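For reference, the Terraform part of such an experiment boils down to the standard workflow. A minimal sketch, assuming the VMware Workstation provider and a clone resource are already declared in the working directory:

$ terraform init      # download the configured provider plugins
$ terraform plan      # preview the clone that would be created
$ terraform apply     # perform the clone and apply the settings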