Sunday, December 15, 2013

Scientific Computing: An Intro

Scientific computing is, in essence, the mathematical and computational basis of numerical simulation. It is used to reconstruct or predict phenomena and processes, especially in science and engineering, often on supercomputers.
It is often called the third way to obtain knowledge, alongside theory and experiment, and it is

transdisciplinary: mathematics + informatics + a field of application.

Objectives may include:
  • Reconstruct and understand known scenarios (natural disasters)
  • Optimize known scenarios (technical processes)
  • Predict unknown scenarios (like the weather)

One might wonder: why would we need numerical analysis at all? Well, there can be many possible reasons (a toy simulation sketch follows this list):

1. Experiments are sometimes impossible, like:
      - Predicting the life cycle of galaxies
      - Weather forecasting
      - Predicting the stock market, or predicting economic effects

2. Experiments can sometimes be unwelcome, including:
      - Tests of nuclear weapons
      - Stability of buildings
      - Propagation of harmful substances

3. Experiments can be costly:
      - Car crash tests
      - Aerodynamics
      - Analysis and study of proteins
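To make the idea of numerical simulation concrete, here is a toy sketch in Python (all parameter values are invented for illustration) that predicts how a hot object cools by stepping Newton's law of cooling, dT/dt = -k(T - T_env), forward in time with Euler's method:

# Toy numerical simulation: Newton's law of cooling integrated with
# Euler's method. Every constant below is an illustrative made-up value.
def simulate_cooling(T0=90.0, T_env=20.0, k=0.1, dt=0.5, steps=20):
    T = T0
    trajectory = [T]
    for _ in range(steps):
        T += dt * (-k * (T - T_env))  # Euler step: T(t+dt) ~ T(t) + dt*dT/dt
        trajectory.append(T)
    return trajectory

for i, T in enumerate(simulate_cooling()):
    print(f"t = {i * 0.5:4.1f}  T = {T:6.2f}")  # dt = 0.5 in this run

The same recipe (discretize time, update the state step by step) scales up to weather models and crash simulations; the difference lies in the number of equations and the sophistication of the solver.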

What's interesting is that researchers often master one particular tool and then use it to solve complex problems in their discipline. Let's look at some of the most popular tools researchers use for this.


Mathematica is computational software used in many disciplines across science and engineering, developed by Wolfram Research. It covers much of what MATLAB offers and also extends to 2D and 3D processing and parallel programming.



MATLAB, short for MATrix LABoratory, is a numerical computing environment, sometimes also called a fourth-generation programming language. It allows plotting of functions and data, implementation of
algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, and Fortran.


With all of the technological innovation happening today, this field of computation will only become more valuable as the problems we face grow more complex. It will be great to see which problems get solved with ongoing technological advancement.


Monday, December 9, 2013

Computer Graphics: It can do Miracles!

Ever wondered how we now get graphics on our mobile devices that match what gaming consoles delivered just a few years ago? Apple keeps increasing its display resolution every year and delivers stunning display quality.

In this blog post I will give a brief intro to computer graphics and its applications. I won't go into too much detail about anything, but I encourage you to read the links provided. Coming back to how Apple manages this: it's all about pixels, and making them shine without drinking a lot of power. There are many ways to do that, and I will leave it to you to explore the different techniques if you are interested.
This link sums up pretty much everything neatly, giving you a brief history into the past as well.

Some of the different types of images we see in day-to-day life, which are the core essentials of computer graphics, are:

2D images are used in applications that were originally developed around traditional drawing and printing technologies.

Pixel-art
A large form of digital art, pixel art is created through the use of raster graphics software, in which images are edited at the pixel level.



vector graphics
Vector graphics are complementary to raster graphics. Instead of storing pixels, they encode information about the shapes and colors that comprise the image, which allows for flexible rendering at any resolution.
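To see the raster/vector distinction in code, here is a toy Python sketch of my own (not from any graphics library): the same diagonal line kept as a shape description and rendered into pixel grids of different sizes:

# A vector description stores the shape; rasterizing turns it into pixels.
# Once rasterized, the pixels are fixed; the vector form can be re-rendered
# at any resolution. Toy example for illustration only.
def render(line, size):
    grid = [["." for _ in range(size)] for _ in range(size)]
    (x0, y0), (x1, y1) = line["from"], line["to"]
    for step in range(size):
        t = step / (size - 1) if size > 1 else 0.0
        x = round((x0 + t * (x1 - x0)) * (size - 1))
        y = round((y0 + t * (y1 - y0)) * (size - 1))
        grid[y][x] = "#"
    return "\n".join("".join(row) for row in grid)

vector_line = {"shape": "line", "from": (0.0, 0.0), "to": (1.0, 1.0)}
print(render(vector_line, 5))    # raster form: 5x5 pixels, fixed once drawn
print(render(vector_line, 10))   # same vector data, re-rendered larger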



3D graphics, compared with 2D, use a three-dimensional representation of geometric data. Despite many differences, 2D and 3D graphics rely on many of the same algorithms. 3D graphics are often referred to as 3D models, since beyond the rendered image the model data itself is stored in the graphics file.
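One core idea behind 3D graphics is projecting 3D points onto a 2D screen. Here is a minimal sketch of pinhole perspective projection (the focal length and points are made-up values):

# Perspective projection: a 3D point (x, y, z) lands on screen at
# (f*x/z, f*y/z). Larger z (farther away) gives smaller screen coordinates,
# which is why distant objects look smaller.
def project(point, focal_length=1.0):
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    return (focal_length * x / z, focal_length * y / z)

print(project((1.0, 1.0, 2.0)))   # near point -> (0.5, 0.5)
print(project((1.0, 1.0, 10.0)))  # same point farther away -> (0.1, 0.1)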



Animated pic
Computer animation is the art of creating moving images with computers. To create the illusion of motion, an image is displayed and then quickly replaced by a new image that is similar to the previous one but slightly shifted. This is much like the illusion of movement in television and motion pictures.
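That frame-by-frame trick is easy to sketch. Here is a toy terminal animation of my own (not from any animation framework) that redraws a slightly shifted image ten times a second:

import time

# Each frame is the previous image shifted one step right; played quickly
# enough, the eye reads the sequence as smooth motion.
WIDTH, FPS = 20, 10
for position in range(WIDTH):
    frame = "." * position + "@" + "." * (WIDTH - position - 1)
    print("\r" + frame, end="", flush=True)  # redraw the line in place
    time.sleep(1 / FPS)                      # hold the frame briefly
print()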



A list of the different styling techniques, with brief intros, can be found at the following link. A very good way to understand some of the graphics work we see in games and other animations is to watch this video, which shows how real-time planet rendering and lighting is done in OpenGL.



One would always wonder: why do we need computer graphics?

   The importance lies in its applications. In engineering fields like automotive and aerospace, it provides the ability to quickly visualize newly developed shapes. Before the advent of computer graphics, designers built expensive prototypes and time-consuming clay models; now they can iterate interactively with the help of computers.

Medical imaging is another application where computer graphics has proven very valuable. Examples include 3D X-rays.

Computer graphics has also expanded the boundaries of art and entertainment. Movies such as Jurassic Park and Avatar are good examples that use computer graphics to push the bounds of imagination.
Virtual reality is fast becoming an indispensable tool in education: flight simulators train pilots for extreme conditions, and surgical simulators train novice surgeons without endangering patients. None of this would have been possible without computer graphics.

 And as the industry develops, one's imagination is the only limit to what can be built. 

Saturday, November 30, 2013

Networking: How does the Internet work :/

Have you ever, while surfing Facebook, thought about how the Internet actually works? How am I able to speak with people from across the world within seconds? How is it that you give it a name and it fetches you the page so quickly? We say that the computer understands everything in 0s and 1s... but then, how does the computer know where to go when we type www.google.com?

The answer to all these questions is just two words: Computer Networking. Networking is a huge topic; it includes protocols, DNS, DHCP, topologies, the different networking standards, and so on (I'm sure you already know most of the things on that list). For me, it was always intriguing to learn how the internet actually functions, and to take that interest forward I studied different routing protocols, which is what I'll be talking about in this post.
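As a small taste of the name-to-address question above, here is the DNS resolution step in Python; a browser does much more, but this is how a hostname becomes an IP address:

import socket

# Ask the operating system's resolver to translate a hostname into an
# IP address, the DNS step a browser performs before connecting.
address = socket.gethostbyname("www.google.com")
print(address)  # prints one of Google's addresses; varies by time and place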

Routing protocols are the most vital aspect of networking: when a packet leaves your machine for the internet, a route must be selected for it, and making the correct and shortest route available is the job of these routing protocols. Different protocols exist for different network topologies and are used by service providers, like RIP (Routing Information Protocol), OSPF (Open Shortest Path First), and BGP (Border Gateway Protocol). Just remember one thing: the core infrastructure of the internet runs BGP as its routing protocol. (This core is a select few networks that carry routes from one country to another across the world; as of 2013 there are six Tier 1 providers in the telecommunications industry: Level 3 Communications, CenturyLink, Vodafone, Sprint, AT&T, and Verizon.) Since BGP is a bit more complex than other routing protocols, I will not get into it here.
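OSPF, for instance, is a link-state protocol: every router learns the full topology and runs Dijkstra's shortest-path algorithm over it. Here is a compact sketch, with an invented toy topology and link costs:

import heapq

# Dijkstra's shortest-path algorithm, the computation at the heart of
# link-state protocols like OSPF. Topology and costs below are made up.
def shortest_paths(graph, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry; a better route was already found
        for neighbor, cost in graph[node]:
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

topology = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}
print(shortest_paths(topology, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}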

Another mechanism worth knowing is MPLS (Multi-Protocol Label Switching), the most important forwarding mechanism used alongside IP. Service providers have been changing their infrastructure to include MPLS, and with MPLS traffic-engineering capabilities it becomes much more profitable to deploy.
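The key MPLS idea is that each router forwards on a short label instead of the full IP destination, swapping the incoming label for an outgoing one at every hop. A toy sketch of that lookup (all labels and hops invented for illustration):

# An MPLS router keeps a label forwarding table: incoming label ->
# (outgoing label, next hop). Forwarding is one exact-match lookup instead
# of a longest-prefix IP lookup. All labels and hops below are made up.
label_table = {
    17: (22, "router-B"),    # swap 17 -> 22, send toward router-B
    22: (35, "router-C"),
    35: (None, "egress"),    # pop the label at the end of the path
}

def forward(label):
    out_label, next_hop = label_table[label]
    action = "pop" if out_label is None else f"swap -> {out_label}"
    print(f"label {label}: {action}, forward to {next_hop}")
    return out_label

label = 17
while label is not None:  # follow the label-switched path hop by hop
    label = forward(label)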

Since routing capabilities are needed by everything that connects to the internet or to some other network for communication, we now have to build routing into devices one could never have imagined, from mirrors to poles alongside the road.

One such protocol I came across recently is DFF (Depth-First Forwarding), which was proposed for low-power devices in networks whose topology changes frequently. You might think: why do we need more protocols if the ones mentioned above have served us for years? Well, you need to understand that a routing protocol uses a lot of power and CPU cycles to keep its routes updated. On low-powered devices, those power constraints call for a different kind of protocol.

DFF fits this use case, and the good thing about it is what happens on failure: a conventional routing protocol drops a packet that cannot reach its destination and resends it later, whereas DFF, before dropping the packet and declaring the neighbor unreachable, tries all of its neighbors in depth-first order. This way it checks all the possible paths to a particular network instead of just one. Look at the figure below to get a better understanding.

 
DFF Forwarding
Consider a packet being sent from node 1 to node 4. Node 3 sends the packet to node 4, but suppose the ACK from node 4 is lost. Node 3 then does not just drop the packet and retry later; it hands the packet back to the node it came from, which in turn tries all of its other neighbors until the network finally concludes that the node is unreachable. The real mechanism is not as simple as illustrated: duplicate packets, loops, and other issues must be handled, which I won't get into in this post. If you are curious, go ahead and read RFC 6971. There is a lot of research going on in this field and in many other protocols, and it feels good to be a part of it.
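Here is a rough sketch of the depth-first idea, a heavy simplification of RFC 6971 that ignores the duplicate-packet and loop handling mentioned above (topology and the failed link are invented for illustration):

# Simplified depth-first forwarding: try each neighbor in turn; if delivery
# through a neighbor fails (no ACK), back up and try the next one.
def dff_send(graph, node, dest, failed_links, visited=None):
    if visited is None:
        visited = set()
    visited.add(node)
    if node == dest:
        return [node]
    for neighbor in graph[node]:
        if neighbor in visited:
            continue
        if (node, neighbor) in failed_links:  # no ACK came back on this link
            continue
        path = dff_send(graph, neighbor, dest, failed_links, visited)
        if path:                              # delivered through this branch
            return [node] + path
    return None                               # every neighbor tried: backtrack

graph = {1: [2], 2: [1, 3, 5], 3: [2, 4], 4: [3, 5], 5: [2, 4]}
# Suppose the ACK on the 3 -> 4 link is lost, so that link looks broken;
# node 3 fails, node 2 backtracks and delivers through node 5 instead:
print(dff_send(graph, 1, 4, failed_links={(3, 4)}))  # [1, 2, 5, 4]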


If we want everything shown in the video below, networking will be a crucial part of making it happen. 



Stay tuned for more!

Thursday, November 28, 2013

Artificial Intelligence: The Best is Yet to come...

Wouldn't it be great to have all your work done by robots? Yes indeed, it looks like a scene from a fictional movie, but researchers around the world are working hard to make that dream possible one day, and given recent inventions, I feel we are not very far from it. By now you will have figured out that I'm talking about Artificial Intelligence.

Artificial Intelligence is often defined as the "study and design of intelligent agents," where an intelligent agent perceives its environment and takes actions that maximize its chances of success. In this blog post I want to show you some amazing inventions that make me believe we are not too far from achieving the impossible. I am sure that after reading this, you will feel the same.

To start off, IBM has built an artificial-intelligence program named Watson, which accesses roughly 200 million pages of information and is able to understand natural language and answer questions. The idea was that Watson's encyclopedic knowledge of medical conditions could aid a human expert in diagnosing illness, as well as contributing computer expertise elsewhere. IBM later announced that it could be used for a wide range of call-center, technical-support, and technical-sales applications. Look at this video of Watson beating the reigning champion.


Intelligent Transportation
We all know about Google's driverless car, which has been driving in and around the Bay Area for quite some time now. Another, rather cheaper, invention is that of a computer scientist from Israel who modified his Audi A7 by adding a camera and artificial-intelligence software, enabling the car to drive the 65 km highway between Jerusalem and Tel Aviv. The technique differs from Google's in that it uses cheaper methods to achieve its goal, unlike Google's LIDAR, which is expensive to deploy on production vehicles, as cited by Elon Musk.

Emotional Computing
Currently, at a preschool near the University of California, San Diego, a child-sized robot named Rubi plays with children. It listens to them, speaks to them, and understands their facial expressions. Farther down the road, it is likely that applications will know exactly how people are reacting as a conversation progresses, a step well beyond Siri.

Robotics
A race is already under way to build robots that can walk, open doors, climb ladders, and generally replace humans in hazardous situations. In November, DARPA held a $2 million contest to build a robot that could take the place of humans on battlefields. For the home, companies are designing robots more sophisticated than today's vacuum-cleaner robots. Hoaloha Robotics recently said it planned to build robots for elder care, an idea that, if successful, might make it possible for more of the aging population to live independently.


Given all of this happening, I am desperately waiting to see some ground-breaking innovation that will blow our minds away!

Saturday, November 16, 2013

Computer Science: History simplified...

     Computer science: it sounds like the scientific study of computers, doesn't it? But Edsger Dijkstra famously said: "Computer science is no more about computers than astronomy is about telescopes." Also, don't scientists study nature, not machines?
     So what is computer science about? In a word: algorithms. Obsessively inventing, testing, debugging, and improving algorithms. The algorithms might be controlling the brain of a robot, encrypting a massive stock trade, simulating an ecosystem, chasing an avatar through a virtual swamp, attacking a drug lord's computer, or searching a network.

     When I started reading about its history, I found it goes back into the age of B.C., and I personally feel good to be a part of a discipline which has made, and is still making, a huge change in our day-to-day lives. So I thought I would present you with something very concise, ranging from before the 1900s to today. Hope you like it!

Before 1900
It started with the abacus, and later another device called the Antikythera mechanism, recovered from a shipwreck off the Greek island of Antikythera. Then, somewhere in 1550-1617, came Napier's rods, invented of course by John Napier, which simplified the task of multiplication. Of course there were many other inventions before the modern punched card, whose use in data processing was pioneered by Herman Hollerith.

1900 - 1939 The Rise of Mathematics
Computer science has always been about calculating with numbers and doing as many calculations as possible at one time. This was the era when much of the underlying mathematics was discovered, and when the very famous Turing machine was conceived.

1940s
The Second World War brought the era of digital computers and many other inventions whose concepts are still used today, ranging from ENIAC, EDVAC, and EDSAC to the invention of magnetic core memory. This era also saw great cipher machines like Enigma and Purple.

1950s
This is the era that defined modern computer science and its concepts. The first actual computer "bug" had been found in 1947. FORTRAN, one of the earliest high-level languages with a compiler, arrived in 1957. We also got Dijkstra's shortest-path algorithm and the Turing Test.

1960s
Computer science was formally defined as a discipline in this era. The first computer science department was established at Purdue University in 1962 (I'm sure most computer science students, like me, would have no clue about that). The first Ph.D. from a computer science department went to Richard Wexelblat, at the University of Pennsylvania, in December 1965.
Operating systems saw some advances. BASIC was developed, and the computer mouse was invented in 1968. Work on the first microprocessor began at Intel in 1969, and ARPANET, a precursor to the Internet, was developed.

1970s
Some great inventions that most people know today were made in this era. Database theory saw major advances, such as the relational model. Unix was developed, and with it the C language. Other building blocks of today's computing appeared as well: Pascal, the RISC architecture, the theory of NP-complete problems, supercomputers, Usenet, and RSA.

1980s and 1990s
This period saw the birth of Apple Computer and of computer viruses; parallel computers came into development, along with quantum computing, biological computing, and so much more. With time, computers kept getting smaller and smaller, aided by the birth of nanotechnology.

We have been to space and back. Today we can make burgers without killing animals and print guns in 3D. With such ground-breaking research, only time will show what is coming next. I'm definitely excited to see what's coming our way.


Let me know what you guys feel!




Sunday, November 10, 2013

File Sharing: What's new in this?


Imagine what the world would be like today if you didn't have access to files (music, pictures, movies, etc.). Would you wait for days until they reached you by a man travelling cross-country? Or, in today's world, would you carry all your favorite media around in removable devices (like CDs) wherever you go? Certainly not. 

We today are blessed with technological innovation that has changed our lives by leaps and bounds. Today we find almost everything available online, no matter where we are. But with more and more data moving to cloud services, we are no longer in control of our data, and as the concept of the decentralized web gains traction, more and more people are thinking of ways to change that.

The cause for this is obvious: the number of security flaws and privacy disasters made public has spiked recently. In April 2011, Dropbox changed its security terms of service to state that Dropbox has full access to user data. Similarly, Facebook has changed its privacy terms and conditions year after year, moving from being a private communication platform to one that shares user information with advertisers and business partners, thereby limiting users' control over their data.

Decentralized Applications:

The most popular among these is Diaspora. The project was started in 2010 by four young programmers from NYU's Courant Institute and raised a record $200,000 on Kickstarter. It has been touted as the "Facebook killer" that gives users control over their data security. This is achieved by each user having their own Diaspora node; essentially, users can run their own "Facebook server" at home (or anywhere else they prefer). The Diaspora nodes interact with each other to form one distributed social network. Furthermore, instead of having to log into one single server, users can choose among many servers administered by different entities. This way they can decide whom to trust with their data, and no single entity has full access to it. (source)

Similarly, many other applications have been developed recently; one such is buddycloud. It works somewhat like Diaspora, but the project is also working with the W3C, Mozilla, and the XSF to build a foundation so that soon all products can have a new social layer on top. This can be understood with a simple diagram:
buddycloud

This way each user can select which websites to share data with. Isn't that cool? You get to choose whom to share your data with.

Decentralized Storage:
Given the security issues with storing data on public servers whose operators have access to all client data, ownCloud is being developed as a replacement for Dropbox. It allows users to run their own cloud and access their files from all their devices.
Likewise, the Locker project allows users to set up their own hosted server. This is achieved by installing its software on the user's server, providing features similar to what Dropbox does.

It is exciting to see so many people who feel that things have to change coming up with ideas and projects to make it happen. I'm sure we will see many exciting things in the future that will change the way we access and store our personal data over the public internet. 

Saturday, November 2, 2013

Data Structures: What to know about them


All of you must have asked yourself this question in college (at least the computer science students did, while studying data structures): "Will I really use all of these in my professional life?" or "Where will I actually use them?" The answer becomes clear as we get more involved in developing projects and applications where computation time (i.e., the time required to complete some task) is more important than anything else. You do not want to be waiting all day for a certain result before you can proceed further. 

Well, this is where expertise in data structures and algorithms comes into the picture. A data structure is an integral part of any computer science problem: it specifies a way of storing and organizing data in computer memory so that it can be used efficiently. Different applications have different requirements. In some, data retrieval needs to be fast compared to storage; in others, we need the data kept in sorted order so that each time we retrieve the smallest element. Depending on the requirements of the application or problem at hand, we select the data structure accordingly.
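That "always retrieve the smallest" requirement is exactly what a heap (priority queue) provides. A quick sketch with Python's built-in heapq module:

import heapq

# A min-heap keeps the smallest element on top: O(log n) insertion and
# O(log n) removal of the minimum, instead of re-sorting on every insert.
numbers = []
for n in (5, 1, 3):
    heapq.heappush(numbers, n)
while numbers:
    print(heapq.heappop(numbers))  # prints 1, 3, 5: always smallest first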

Different data structures take different approaches to storing data and retrieving it from memory. Some favor fast insertion over retrieval, like the linked list, where adding at the head is constant-time but searching requires a pass over the whole list; for others it's the opposite. Each data structure has time and space complexities associated with it, which estimate the resources an operation will need, because you cannot simply deploy your application and only then check how long it takes. These estimates are made mathematically, using asymptotic (Big-O) analysis and tools such as recurrence relations.  


Well, there are loads of data structures to choose from when solving a problem: heaps, hash tables, trees, linked lists, queues, and more. To get the complete list of data structures, you can see this wiki.
While writing this blog post, I thought I would cover a data structure I came across recently, called the trie, which is widely used in many applications even though most of us are not even aware of it.


The trie is used in many of the applications we use in our daily lives.

It is used in search engines for storing the occurrences of words in particular URLs, and in routers to match an IP address against a routing table. 
It has many advantages over other data structures in terms of searching, inserting, and deleting, all of which can be found here.
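To make it concrete, here is a minimal trie sketch in Python, covering insert and exact-word search only; real implementations add deletion, prefix queries, and compressed nodes:

# A trie stores strings character by character along shared paths, so insert
# and lookup cost O(word length), independent of how many words are stored.
class TrieNode:
    def __init__(self):
        self.children = {}     # character -> TrieNode
        self.is_word = False   # does a stored word end at this node?

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def search(self, word):
        node = self.root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_word

t = Trie()
for w in ("car", "cart", "cat"):
    t.insert(w)
print(t.search("cart"))  # True
print(t.search("ca"))    # False: "ca" is only a prefix, not a stored word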


It is a fairly simple data structure to understand, yet it is used in all sorts of complex applications while most of us remain unaware of it. I would highly recommend watching this video from IIT Delhi; the concepts and the problems are explained with solutions and reasoning. Also, this link has some good implementation details if you are interested.
Well, be it for technical interviews or problem solving, my best bet would be to at least know what this data structure does.