What’s granular computing?

Granular computing blends precise information with more general detail and incorporates uncertainties and probabilities into computing. It is used to structure problem solving, group data in databases, and organize information for data mining. It helps computers work more like human thought processes and is used in many business, medical, and security computing systems.

Granular computing is a problem-solving approach that blends precise information with more general detail. It focuses on how to incorporate uncertainties and probabilities into computers. Originally devised in the 1970s, this method of theoretical computer science has been incorporated into computer programming and artificial intelligence. The principle of fuzzy sets was developed in the 1960s to handle uncertainty; both fuzzy set theory and probability theory are typically used in granular computing. This method has often been associated with terms like rough set theory, data compression, and machine learning.
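The fuzzy-set idea mentioned above can be illustrated with a minimal sketch: instead of a crisp yes/no answer, each value gets a degree of membership between 0 and 1. The thresholds and the "fast" category here are purely illustrative, not part of any standard.

```python
# Minimal sketch of a fuzzy set: a value belongs to a category to a
# degree between 0.0 and 1.0, rather than simply in or out.
# The 60/120 km/h bounds are illustrative assumptions.

def membership_fast(speed_kmh: float) -> float:
    """Degree to which a speed counts as 'fast' (0.0 to 1.0)."""
    if speed_kmh <= 60:
        return 0.0
    if speed_kmh >= 120:
        return 1.0
    return (speed_kmh - 60) / 60  # linear ramp between the two bounds

for s in (50, 90, 130):
    print(s, membership_fast(s))  # 50 -> 0.0, 90 -> 0.5, 130 -> 1.0
```

A probability, by contrast, would say how likely a speed is; the membership degree says how well it fits a vague category, which is the kind of imprecision granular computing works with.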

Used as a way to structure problem solving and general thinking, granular computing has been modeled in different ways. It is often used to group data in large databases and has sometimes been used for data abstraction and generalization to organize information. This is important for data mining because people often don’t think about information in specific, complex numerical terms. Computers can analyze language to evaluate how to use search terms, so granular computing is often part of how search results are retrieved.

Data mining in an enterprise network often involves granular processing, and Internet search engines usually work the same way. A general search term can then lead a person to a website with more details on a topic. In a typical database, information is organized into different classes, clusters, and subsets depending on a number of variables. Business computer programs can use this data classification method to organize large amounts of information; employees can then retrieve information when it’s needed most.
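The grouping of records into classes described above can be sketched as a simple granulation step: coarse labels replace precise values so that queries can operate on granules instead of individual rows. The field names, amounts, and bin boundaries here are hypothetical.

```python
# Sketch of granulating database records: map a precise attribute
# (order amount) onto coarse classes, then group rows by class.
# All names and thresholds are made up for illustration.

from collections import defaultdict

orders = [
    {"customer": "A", "amount": 25},
    {"customer": "B", "amount": 310},
    {"customer": "C", "amount": 78},
    {"customer": "D", "amount": 1200},
]

def granule(amount: float) -> str:
    """Coarse class for a precise amount."""
    if amount < 100:
        return "small"
    if amount < 500:
        return "medium"
    return "large"

granules = defaultdict(list)
for order in orders:
    granules[granule(order["amount"])].append(order["customer"])

print(dict(granules))
# {'small': ['A', 'C'], 'medium': ['B'], 'large': ['D']}
```

An employee asking for "large orders" never needs the exact amounts; the query runs at the granule level, which is the point of organizing data this way.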

Humans generally don’t think like computers. Words are used to represent abstract ideas, often at the cost of precision. Words and sentences usually stand in for complex ideas; the brain typically doesn’t calculate details like speed or precise distance, for example, while a sensor connected to a computer can. The brain can judge whether something tastes or feels good, but it generally can’t count a large number of things unless that information is already available.

Granular computing, therefore, helps computers work more like the thought processes taking place in a person’s head. Between the two typically sit numbers, elements of computer languages, and probability constraints. The end result is a computer program that can interpret how people communicate with a computer interface. Enabled by years of theoretical computer science, this concept is used in many business, medical, and security computing systems and can be applied to the Internet as well.



