A simple approach to learning C# 8.0 and .NET Core 3.0

The book is structured very well, taking you step by step from one level to the next.

It covers the following:

1. Setting up the development environment

2. Getting to know .NET

3. Basic concepts of all the essential tools and syntax needed

4. Understanding Git

5. Understanding the versions of C#, with grammar and vocabulary

6. Debugging, creating test cases, and doing configuration

7. Packaging and deployment details

8. Various serialization techniques

9. Database connectivity and use of Entity Framework Core

10. LINQ and lambdas

11. Performance improvement

12. Xamarin and other mobile services

This book covers the basic concepts along with an explanation of the behind-the-scenes execution process, which we rarely see in books nowadays.

If you are looking to learn the concepts of C# and .NET Core 3.0, and how to use them with different Microsoft modules, this is an excellent option.

Nice book. - Mark

Deep Learning: A Subset of Machine Learning

In the present software era, we are focused on providing meaningful information to customers and partners based on the data they have gathered over the past decades.

The data can be in almost any form: raw data in Excel, structured data in SQL Server or Oracle, or data in any of the unstructured databases. Over the decades, the core sectors have accumulated enough data to help them plan for upcoming or future market conditions, for both the best and the worst cases.

With deep learning, the accuracy of prediction has increased tremendously, and it sometimes exceeds human performance as well. The latest inventions, such as driverless cars and recognizing an image from among millions of images, are achieved using deep learning.

That said, we are in the era of prediction with precision and accuracy.

The diagram below shows the three nested sets that relate AI, machine learning, and deep learning.

Relationship diagram: AI vs. machine learning vs. deep learning


Deep learning works on the concept of artificial neural networks. The learning/training process is "deep" because of the structure of these networks, which consists of multiple layers of the following:

  1. Input layer
  2. Output layer
  3. Hidden layers

Every layer contains units that transform the raw input data into meaningful processed information, which other layers then use for predictive tasks. This is how the machine learns through its own data processing.

The more layers that are defined, the higher the accuracy can be, but while designing the network we need to be clear about the output we are looking for and define the structure accordingly. We get multiple outputs with different accuracy levels to choose from, and the prediction improves with each training pass.
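The layered structure described above can be sketched in a few lines of plain Python. This is a minimal forward pass through one hidden layer; the weights and biases here are illustrative values, not trained ones, and a real network would learn them from data.

```python
import math

def sigmoid(x):
    # Squash a value into (0, 1); a common unit activation function.
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One dense layer: each unit computes a weighted sum of its inputs
    # plus a bias, then applies the activation.
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(inputs):
    # Input (2 units) -> hidden layer (3 units) -> output layer (1 unit).
    # All weights below are made-up numbers purely for illustration.
    hidden_w = [[0.5, -0.6], [0.1, 0.8], [-0.3, 0.2]]
    hidden_b = [0.0, 0.1, -0.1]
    output_w = [[0.7, -0.4, 0.9]]
    output_b = [0.05]
    hidden = layer(inputs, hidden_w, hidden_b)
    return layer(hidden, output_w, output_b)

print(forward([1.0, 0.0]))  # a single prediction, somewhere in (0, 1)
```

Training would repeatedly adjust the weights to push this output toward the desired one; stacking more hidden layers is what makes the network "deep".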

There are various use cases that help in understanding deep learning; below are some examples:

  1. Object detection
  2. Image caption generation
  3. Machine translation
  4. Text analytics

With the help of deep learning models, a remarkable range of such tasks can be achieved.

Let's take a simple example from our day-to-day life. We all use mobile phones and take pictures with them. When we edit a picture, we get options such as:

  1. Filter
  2. Portrait light
  3. Beautify

All of these are done with deep learning: the software understands the image and provides the various options to use while editing it.

Another example is Google Lens: we take a picture, submit it, and it finds similar images and details about them.

In short, we have gone deep into deep learning, and we can see its use everywhere.

Azure SQL Database: Hyperscale service tier

The Hyperscale service tier in Azure SQL Database is the newest service tier in the vCore-based purchasing model. This service tier is a highly scalable storage and compute performance tier that leverages the Azure architecture to scale out the storage and compute resources for an Azure SQL Database substantially beyond the limits available for the General Purpose and Business Critical service tiers.

The Hyperscale service tier in Azure SQL Database provides the following additional capabilities:

  • Support for up to 100 TB of database size
  • Nearly instantaneous database backups (based on file snapshots stored in Azure Blob storage) regardless of size with no IO impact on compute resources
  • Fast database restores (based on file snapshots) in minutes rather than hours or days (not a size of data operation)
  • Higher overall performance due to higher log throughput and faster transaction commit times regardless of data volumes
  • Rapid scale out – you can provision one or more read-only nodes for offloading your read workload and for use as hot-standbys
  • Rapid scale up – you can, in constant time, scale up your compute resources to accommodate heavy workloads as and when needed, and then scale the compute resources back down when not needed.
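As a rough sketch of how such a database might be provisioned and rescaled, here is what it could look like with the Azure CLI's `az sql db` commands. The resource group, server, and database names are placeholders, and the exact flags should be checked against the current CLI documentation before use.

```shell
# Create a Hyperscale database in the vCore purchasing model
# (names below are placeholders for illustration).
az sql db create \
  --resource-group my-rg \
  --server my-sqlserver \
  --name my-hyperscale-db \
  --edition Hyperscale \
  --family Gen5 \
  --capacity 2

# Later, scale compute up to 4 vCores for a heavy workload,
# then run the same command with a lower value to scale back down.
az sql db update \
  --resource-group my-rg \
  --server my-sqlserver \
  --name my-hyperscale-db \
  --capacity 4
```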

Who should consider the Hyperscale service tier

The Hyperscale service tier is primarily intended for customers who have large databases on-premises and want to modernize their applications by moving to the cloud, or for customers who are already in the cloud but are limited by the maximum database size restrictions (1-4 TB). It is also intended for customers who seek high performance and high scalability for storage and compute.

The Hyperscale service tier supports all SQL Server workloads, but it is primarily optimized for OLTP. The Hyperscale service tier also supports hybrid and analytical (data mart) workloads.


Robotic Process Automation: The Illusion

On hearing the term RPA, the general myth is that robots are going to take individuals' jobs. The same myth existed when the computer industry wave came, but we saw the reverse: more people got jobs in the industry.

There is nothing to be afraid of, as RPA is going to automate the recurring processes that humans perform, which we call low-end work. This will help people upskill themselves so they can get better jobs and better pay to support their livelihood.

RPA cannot replace humans; it helps with repetitive tasks. Take the general scenario of invoice verification.

An accountant receives an invoice for processing, and the first step is to verify its correctness, checking that:

  1. The invoice number is present
  2. The date is valid
  3. The company name is properly written
  4. The PO number is present
  5. The invoice ID is present

Verifying one invoice takes a couple of minutes, and a company may receive 1,000 invoices. If this verification is automated using RPA and the accountant only receives verified invoices, that time is saved and he can focus on other tasks.
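The verification step above is the kind of rule-based check an RPA bot performs. Here is a minimal sketch in Python; the field names, the ISO date format, and the `PO-` number pattern are all assumptions for illustration, since a real flow would take whatever the extraction step produces.

```python
import re
from datetime import datetime

# Hypothetical field names; a real RPA flow would map these from
# the document-extraction (OCR) step.
REQUIRED_FIELDS = ["invoice_no", "date", "company_name", "po_number", "invoice_id"]

def verify_invoice(invoice: dict) -> list:
    """Return a list of problems; an empty list means the invoice passed."""
    problems = [f"missing {f}" for f in REQUIRED_FIELDS if not invoice.get(f)]
    date = invoice.get("date")
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")  # assumed ISO date format
        except ValueError:
            problems.append("invalid date")
    po = invoice.get("po_number")
    if po and not re.fullmatch(r"PO-\d+", po):
        problems.append("malformed PO number")   # assumed PO pattern
    return problems

ok = {"invoice_no": "INV-42", "date": "2020-01-31",
      "company_name": "Contoso", "po_number": "PO-1001", "invoice_id": "42"}
print(verify_invoice(ok))  # → [] (passes, so it goes straight to the accountant)
```

Only invoices that come back with an empty problem list would be forwarded to the accountant; the rest are routed back for correction.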

Similarly, there can be many examples of process automation in different industries.

So, in short, we should not see RPA as a job snatcher. It is helping us boost our skills and move to the next level.


Hyperledger

Hyperledger can be thought of as software that anyone can use to create their own personalized blockchain service.

On a Hyperledger network, only the parties directly affiliated with a deal have their ledgers updated and are notified, thus maintaining privacy and confidentiality.


Committer: responsible for appending validated transactions to its specific ledger.

Endorser: responsible for simulating transactions and preventing unstable or non-deterministic transactions.

Consenter: responsible for the network consensus service; a collection of consensus service nodes (CSNs) orders transactions into blocks according to the network's chosen ordering implementation.
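The division of labor among these three roles can be sketched as a toy pipeline. The role names follow the text above; everything else (the determinism flag, the fixed block size, the in-memory ledger) is a simplifying assumption for illustration, not how a real Hyperledger network is implemented.

```python
# Toy sketch of the endorse -> order -> commit flow.

def endorse(tx):
    # Endorser: simulate the transaction and reject non-deterministic ones.
    return tx.get("deterministic", False)

def order(txs, block_size=2):
    # Consenter: order endorsed transactions into fixed-size blocks.
    return [txs[i:i + block_size] for i in range(0, len(txs), block_size)]

def commit(ledger, blocks):
    # Committer: append validated blocks to its copy of the ledger.
    ledger.extend(blocks)
    return ledger

txs = [{"id": 1, "deterministic": True},
       {"id": 2, "deterministic": False},  # rejected at endorsement
       {"id": 3, "deterministic": True},
       {"id": 4, "deterministic": True}]

endorsed = [t for t in txs if endorse(t)]
ledger = commit([], order(endorsed))
print([[t["id"] for t in block] for block in ledger])  # → [[1, 3], [4]]
```

Note how the non-deterministic transaction never reaches the ordering step, which is exactly the filtering role the endorser plays.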