CA Prashant Jain

total: 227 | displaying: 1 - 20

KVISIMINE Applied To Problems In Geographical Information Systems

Images are highly complex, multidimensional signals with rich and complicated information content in a geographical information system. For this reason they are difficult to analyze through a single automated approach. However, a KVISIMINE scheme is helpful for understanding image content and data content. This paper describes an application of the k-means clustering algorithm and image information mining for the exploration of image information and large volumes of data. A Geographical Information System (GIS) is any system that captures, stores, analyzes, manages, and presents data that are linked to location. Technically, a GIS is a system that includes mapping software and its application to remote sensing, land surveying, aerial photography, mathematics, photogrammetry, geography, and tools that can be implemented with GIS software. Building a GIS is a fruitful area if one likes the challenge of solving difficult technical problems. Some of these problems have been solved in other technologies such as CAD or database management. However, GIS throws up new demands, and therefore requires new solutions. This paper examines these difficult problems and gives some indication of the state of the art of current solutions.
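As an illustration of the clustering step the abstract relies on, the sketch below is a plain k-means loop over 2-D feature points. The data, the deterministic initialization, and all names are illustrative assumptions, not the KVISIMINE implementation.

```python
def kmeans(points, k, iters=20):
    """Basic k-means: assign each point to its nearest centroid, then move
    each centroid to the mean of its assigned points, and repeat."""
    centroids = points[:k]   # simple deterministic init (real uses pick randomly)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                        (p[1] - centroids[c][1]) ** 2)
            clusters[nearest].append(p)
        centroids = [(sum(p[0] for p in cl) / len(cl),
                      sum(p[1] for p in cl) / len(cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Two well-separated groups of 2-D "pixel feature" points
pts = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0),
       (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
cents, groups = kmeans(pts, 2)
```

In an image-mining setting the points would be per-pixel or per-region feature vectors rather than raw coordinates, but the assign/update loop is the same.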

Application of Cloud Computing In University Libraries

Technology is changing rapidly and is forming a layer that touches every aspect of life: power grids, traffic control, medical and health care, water supply, food and energy, and the financial transactions of the world. Cloud computing is no exception in changing the world. Cloud computing provides virtually unlimited, on-demand computing resources, and its infrastructure encourages innovation in every field. One such application of the cloud is in university libraries. The emergence of cloud computing in libraries leaves us with many questions to be answered, such as: 1. How can a library use the cloud to provide effective information to its users? 2. How can information be shared using the cloud? This paper tries to answer such questions and provides a comprehensive introduction to the application of the cloud in university libraries. The cloud is already there; libraries only need to start thinking about how they may need to adjust their services in order to adapt effectively to how users are interacting with it.

Role of Information And Communication Technology In Teacher’s Education

The use of Information and Communication Technologies (ICT) in the academic field is a major point of discussion, where educators and educational researchers are continuously working on innovative ways of using these technologies to support and enhance student outcomes in education. The use of ICT in education and teaching provides a number of benefits, but to avail themselves of these facilities, teachers need a certain level of confidence. They should be prepared to use ICT in teaching and also trained in how to integrate ICT with teaching. The main purpose of this paper is to show the gap between the ICT curricula used in various universities to train teachers and what is expected from international standards, national ICT-in-education policy, and the development of the field of ICT itself. This paper also surveys the challenges and issues faced by teacher educators in preparing and using an ICT training curriculum for the next generation of teachers in the face of rising globalization.

Mobile Commerce: The Next Big Leap In India

Mobile commerce is a new trend in which any transaction with monetary value is conducted in a wireless environment using mobile devices. If you are short of money while shopping, or while traveling you find your wallet empty when you need to pay the driver or the filling station, what will you do? Nowadays help is on the way, and that help is mobile commerce. It has grown to a large extent in the world, but in India it is still confined to basic banking transactions, purchase of travel tickets, and payment of some utility bills. Banks, cellular operators, and payment service providers are now finding solutions that can comply with regulatory guidelines. This paper mainly describes the sectors where mobile commerce can be used and the future challenges that India needs to face.

Knowledge Management – A Challenge For Smaller-Sized Enterprises

Today the success and worth of a business depend more on its intellectual capital than on its physical assets. Therefore, Knowledge Management (KM) has become a critical input in the growth of SMEs. Globalization of supply chains, rapid technological advances, superior returns on intellectual capital, and the growing importance of knowledge-intensive industries make KM a strategic tool in the growth and success of businesses. Access to and integration of SMEs with regional, national and international supply chains require bridging the gaps between the requirements of supply chains and the efficiency and capability of SMEs' KM systems. The KM process involves knowledge capture, knowledge organizing and storage, and distribution and sharing. Successful knowledge management provides the best possible means to apply and leverage the knowledge that has been captured, organized and stored, distributed and shared. It means that very little of the company's highly valued intellectual capital has escaped the knowledge management net: virtually all the knowledge within the enterprise is harnessed and will be used as part of the company's core business and competitive intelligence strategy. KM-enabled SMEs are essential for competitive and sustainable growth. In developing countries, a vast majority of SMEs suffer from market failures due to insufficient provision of integrated, reliable, relevant and solution-oriented business information. SMEs need support for effective linking with global markets, both for their inputs and their outputs. Businesses leveraging knowledge resources can make decisions faster and closer to the point of action. It is obvious that the smaller-sized company also needs to capture and intelligently exploit its knowledge.

Dynamic Access And Emergence Mechanism For Collaborative Adaptive Systems From Agent’s Base

The main issue in human-machine interaction is obtaining a "collaboration situation" between a human user and a computer system: the system must be attuned to the user, and the user to the system. "Good" conventions and guidelines shift the entire burden of adaptation onto the user, and the design restrictions they impose are geared towards easing this task, moving it from the specification and requirements of adoption to learning strategies [1]. But providing such a list of technologies does not capture the essential feature of the intelligent-interface research area: an intelligent interface must use technology to make an improvement. The resulting interface should be better than any other solution, not just different and technically more advanced; it should be more capable in terms of technology, ease of acceptance, and flexibility to move between environments. User modeling and natural language dialogue denote a scheme that the system maintains: the system adapts its behavior to interface access by collaborative learning from its agent base, while at the same time maintaining a system base for interface adoption between system-to-system environments and system-to-user access, with process listing in selected services and a grouping of protocol-enabled agents that maintain environment access and a learning base of adaptation strategies. The development of adaptive systems reinforces the need to "know thy system environment". The interface is set at design time, but it is changeable from an objective perspective. Adaptive systems may require more up-front environment analysis, since not only does the system designer need to know the user, that knowledge must also be embedded into the system.

Knowledge Management: A Tool For An Efficient Organization

The paper takes an in-depth look at knowledge management, a comparatively new business concept that is attracting the consideration of informed and globally oriented companies because of its promise to introduce new rudiments of flexibility and efficiency across the entire business spectrum, from management, plant and production to front-line activities. This paper explains how knowledge management benefits businesses and the business community. It discusses how the need for active knowledge management is understood and acknowledged in numerous companies. The paper studies how often, in practice, this understanding is misconstrued into a false belief that sophisticated and expensive information technology (IT) suffices for good knowledge management. It looks at the requirements of knowledge management and the role played by business intelligence in knowledge management. Further, the differences between organizations' actual and perceived success in knowledge management are discussed. The paper also examines how success, deemed as a firm's ability to generate sustainable growth and profits, is determined not only by knowledge management but is intricately linked to the humans who seek straightforward business solutions and constructively counter challenges.

Data Warehousing And Data Mining

This paper describes a new approach to fast multimedia information retrieval using data mining and data warehousing techniques. To tackle key issues such as multimedia data indexing, similarity measures, search methods, and query processing in retrieval from large multimedia data archives, we extend the concepts of the conventional data warehouse and the multimedia data warehouse for effective data representation and storage. Technological advances are making this vision a reality for many organizations. We discuss the benefits that an organization gains through the use of data mining, and the various stages of predictive data mining: initial exploration, model building or pattern identification with validation/verification, and deployment. The paper also discusses the various strategic applications of data mining and data warehousing. In addition, we propose a fuzzy neural network that provides automatic and autonomous classification of the retrieval outputs by integrating fuzzy logic technology with a Back-Propagation Feed-Forward (BPFF) neural network. A series of case studies is reported to demonstrate the feasibility of the proposed method.
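To make the BPFF component concrete, here is a minimal back-propagation feed-forward network written from first principles. The architecture (2 inputs, 3 hidden units, 1 output), the OR training task, and all names are illustrative assumptions; the paper's actual network integrates fuzzy logic and is not reproduced here.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class BPFF:
    """A tiny back-propagation feed-forward net: n_in -> n_hid -> 1 output."""
    def __init__(self, n_in, n_hid, seed=1):
        rng = random.Random(seed)
        # each unit carries a trailing bias weight
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
                   for _ in range(n_hid)]
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]

    def forward(self, x):
        xb = list(x) + [1.0]                       # input plus bias term
        self.h = [sigmoid(sum(w * v for w, v in zip(ws, xb))) for ws in self.w1]
        hb = self.h + [1.0]
        self.o = sigmoid(sum(w * v for w, v in zip(self.w2, hb)))
        return self.o

    def train_step(self, x, t, lr=0.5):
        o = self.forward(x)
        delta_o = (o - t) * o * (1 - o)            # output-layer error term
        # hidden-layer error terms, computed before w2 changes
        delta_h = [self.w2[j] * delta_o * self.h[j] * (1 - self.h[j])
                   for j in range(len(self.h))]
        hb = self.h + [1.0]
        for j in range(len(hb)):
            self.w2[j] -= lr * delta_o * hb[j]
        xb = list(x) + [1.0]
        for j in range(len(self.h)):
            for i in range(len(xb)):
                self.w1[j][i] -= lr * delta_h[j] * xb[i]
        return 0.5 * (o - t) ** 2

# Learn the (linearly separable) OR function as a stand-in task
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 1.0)]
net = BPFF(2, 3)
loss_first = sum(net.train_step(x, t) for x, t in data)
for _ in range(2000):
    loss = sum(net.train_step(x, t) for x, t in data)
```

In the retrieval setting the inputs would be (fuzzified) feature vectors of candidate results and the outputs class memberships, but the forward/backward passes are the same.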

Knowledge Management (A Tool For Managing Intellectual Excellence)

Knowledge refers to a person’s state of being with respect to some body of information. These states include ignorance, awareness, familiarity, understanding, facility and so on. Knowledge management comprises a range of strategies and practices used in an organization to identify, create, represent, distribute, and enable adoption of insights and experiences. Such insights and experiences comprise knowledge, either embodied in individuals or embedded in organizational processes or practice.

Impact Of Information Technology On Employee Retention In Today’s Competitive Business Environment

In today’s business environment the explosion of information technology has completely transformed the structure of organizations, but retaining valuable employees is one of the biggest problems nowadays. This paper focuses on aspects of information technology and on employee retention strategies and policies that assist managers in achieving their objectives for managing employee retention and remaining competitive, utilizing survey data drawn from various government and private departments.

Data Mining Problem Solving Algorithms & Their Comparative Study

This paper presents comparison schemes between the mining algorithms identified by the IEEE International Conference on Data Mining (ICDM): C4.5, k-means, Apriori, FP-growth, PageRank, AdaBoost, kNN, and CART. These are among the most influential data mining algorithms in the research community. For each algorithm we provide a description, a comparison with the other algorithms, a discussion of its impact, and a review of current and further research on it. These algorithms cover classification and related mining tasks.
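Of the algorithms listed, kNN is the simplest to show in a few lines. The sketch below is a generic majority-vote implementation on toy 2-D data (the data and names are our own, not the paper's comparison code).

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.
    `train` is a list of ((x, y), label) pairs."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
         ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
label = knn_predict(train, (1, 1))
```

kNN is a lazy learner: it has no training phase at all, which is one axis on which such comparative studies contrast it with eager learners like C4.5 or CART.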

A Comprehensive Approach To Grid Computing

Today we are in the Internet world, and everyone prefers fast access to the Internet. But with multiple downloads there is a chance that the system hangs or its performance slows, forcing the entire process to restart from the beginning. This is a serious problem that needs the attention of researchers. We have taken up this problem for our research, and in this paper we provide a layout for implementing our proposed Grid model for fast Internet access. Using the Grid we can download any number of files quickly, depending on the number of systems employed in the Grid. We have used the concept of grid computing for this purpose.
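The core idea of splitting one download across several grid nodes can be sketched as a byte-range split fanned out to workers. Everything below is an assumption for illustration: the "remote file" is an in-memory buffer, `fetch_range` stands in for one node's transfer, and threads stand in for grid machines; the paper's actual Grid model is not reproduced.

```python
from concurrent.futures import ThreadPoolExecutor

FILE = bytes(range(256)) * 4            # stand-in for the remote file
NODES = 4                               # number of "grid nodes"
CHUNK = len(FILE) // NODES

def fetch_range(start, end):
    """Stand-in for one node fetching bytes [start, end) of the file."""
    return FILE[start:end]

# Assign each node a contiguous byte range; the last node takes the remainder.
ranges = [(i * CHUNK, (i + 1) * CHUNK if i < NODES - 1 else len(FILE))
          for i in range(NODES)]
with ThreadPoolExecutor(max_workers=NODES) as pool:
    parts = list(pool.map(lambda r: fetch_range(*r), ranges))
result = b"".join(parts)                # reassemble in range order
```

Because `pool.map` preserves input order, reassembly is a simple concatenation even though the fetches complete concurrently.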

Software Security: An Emerging Problem In Software Engineering

The objective of this research paper is to discuss software security as a problem in software engineering. The solution of this problem leads to a new constructive approach to the modeling, specification, and analysis of application-specific security requirements. The approach is based on a framework we developed previously for generating and resolving obstacles to requirements achievement. Our framework models intentional obstacles set up by attackers to break security goals. Attack trees are derived systematically through anti-goal refinement until leaf nodes are reached that are either software vulnerabilities observable by the attacker or anti-requirements implementable by the attacker. New security requirements are then obtained as countermeasures by applying threat resolution operators to the anti-requirements and vulnerabilities revealed by the analysis. The paper also introduces formal epistemic specification constructs and patterns that may be used to support a formal derivation and analysis process. The method is illustrated on a web-based system for which subtle attacks have been reported recently.
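The attack-tree structure described above can be represented directly: each node is an attacker goal refined into sub-goals, and the leaves are the vulnerabilities or anti-requirements that need countermeasures. The tree contents below are a hypothetical web-login example, not the case study from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class AntiGoal:
    """A node in an attack tree: an attacker goal refined into sub-goals.
    Leaves are observable vulnerabilities or attacker-implementable
    anti-requirements, i.e. the points where countermeasures attach."""
    name: str
    children: list = field(default_factory=list)

    def leaves(self):
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

# Hypothetical anti-goal refinement for a web login (illustrative only)
root = AntiGoal("Obtain another user's password", [
    AntiGoal("Intercept credentials in transit", [
        AntiGoal("Login form served over plain HTTP")]),
    AntiGoal("Guess credentials", [
        AntiGoal("No rate limiting on login attempts"),
        AntiGoal("Weak password policy")]),
])
countermeasure_targets = root.leaves()
```

Collecting the leaves mirrors the paper's derivation step: each leaf becomes the input to a threat-resolution operator that yields a new security requirement.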

Simulating Using ns-2

1.    Introduction: ns-2 is called an event simulator because it maintains a list of events: processing begins with the earliest event and runs it until it is …
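The event-list idea ns-2 is built around can be sketched as a minimal discrete-event loop: a time-ordered queue from which the scheduler repeatedly pops the earliest event and runs its handler, which may in turn schedule further events. This is a generic sketch of the technique, not ns-2's (C++/OTcl) scheduler itself.

```python
import heapq

class EventSimulator:
    """Minimal discrete-event loop: events sit in a time-ordered queue;
    the scheduler pops the earliest one, advances the clock to its time,
    and runs its handler."""
    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = 0      # tie-breaker for events scheduled at the same time

    def schedule(self, delay, handler):
        heapq.heappush(self.queue, (self.now + delay, self._seq, handler))
        self._seq += 1

    def run(self):
        while self.queue:
            self.now, _, handler = heapq.heappop(self.queue)
            handler()

log = []
sim = EventSimulator()
sim.schedule(2.0, lambda: log.append(("recv", sim.now)))
sim.schedule(1.0, lambda: (log.append(("send", sim.now)),
                           sim.schedule(0.5, lambda: log.append(("ack", sim.now)))))
sim.run()
```

Note that events run in simulated-time order regardless of the order they were scheduled: the "send" at t=1.0 fires first, spawns an "ack" at t=1.5, and only then does the "recv" at t=2.0 run.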

Network Simulation Using NCTUns

Introduction: A network simulator is software that is a very helpful tool for developing, testing, and diagnosing any network protocol. To implement a network with any combination of link bandwidths, propagation delays, routers, etc., we do not need to set up the actual network; hence simulation is very economical, and the results obtained are easier to analyze. A simulator needs to simulate various networking devices, application programs, and network utility programs, so developing a simulator requires great effort.

E-Commerce: Its Impact, Trends & Opportunities In Supply Chain Management

Introduction: Electronic commerce, commonly known as e-commerce or eCommerce (electronic marketing), consists of the buying and selling of products or services over electronic systems such …

Applying Knowledge Management (KM) For Sustainable Business Education

1.    Introduction: Today, the business education system is under great pressure from industry (and society) to deliver finished products (graduates and postgraduates) from its system so as …

Emerging Paradigms Of DNA Based Computation

Introduction: DNA computing is a novel technology that seeks to capitalize on the enormous informational capacity of DNA: biological molecules that can store huge amounts of information and are able to perform operations similar to those of a computer, through the deployment of enzymes, biological catalysts that act like software to execute desired operations. The appeal of DNA computing lies in the fact that DNA molecules can store far more information than any existing conventional computer chip. Also, utilizing DNA for complex computation can be much faster than utilizing a conventional computer, for which massive parallelism would require large amounts of hardware, not simply more DNA. Scientists have found the new material they need to build the next generation of microprocessors. Millions of natural supercomputers exist inside living organisms, including your body. DNA (deoxyribonucleic acid) molecules, the material our genes are made of, have the potential to perform calculations many times faster than the world's most powerful human-built computers. DNA might one day be integrated into a computer chip to create a so-called biochip that will push computers even faster. DNA molecules have already been harnessed to perform complex mathematical problems. While still in their infancy, DNA computers will be capable of storing billions of times more data than your personal computer. The practical possibility of using molecules of DNA as a medium for computation was first demonstrated by Leonard Adleman in 1994, when he took a giant step towards a different kind of chemical, or artificial biochemical, computer by using fragments of DNA to compute the solution to a complex graph theory problem. Adleman's method utilizes sequences of DNA's molecular subunits to represent vertices of a network, or "graph".
Thus, combinations of these sequences, formed randomly by the massively parallel action of biochemical reactions in test tubes, described random paths through the graph. Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem out of the many random paths represented by the product DNA strands. Adleman's primary intention was to prove the feasibility of biomolecular computation, but his work also gave an indication that the emergence of this new computational paradigm could provide an advantage over conventional electronic computing techniques. Specifically, DNA was shown to have massively parallel processing capabilities that might allow a DNA-based computer to solve hard computational problems in a reasonable amount of time. However, Adleman's brute-force search algorithm is not, and was never meant to be, a practical means of solving such problems; the volume of material required was found to increase exponentially as the complexity of the problem increased. The main idea behind DNA computing is to adopt a biological (wet) technique as an efficient computing vehicle, where data are represented using strands of DNA. Even though a DNA reaction is much slower than the cycle time of a silicon-based computer, the inherently parallel processing offered by the DNA process plays an important role.
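The generate-and-filter logic of Adleman's experiment can be mirrored in software: enumerate candidate vertex orderings (playing the role of the test tube's randomly assembled strands) and keep only those that start and end at the right vertices and follow only real edges. The 4-vertex graph below is a tiny illustrative example, not Adleman's original 7-vertex instance.

```python
from itertools import permutations

def hamiltonian_paths(edges, n, start, end):
    """Generate-and-filter search in the spirit of Adleman's experiment:
    enumerate every ordering of the n vertices, then filter down to the
    orderings that begin at `start`, finish at `end`, and traverse only
    existing (directed) edges."""
    edge_set = set(edges)
    found = []
    for perm in permutations(range(n)):
        if perm[0] != start or perm[-1] != end:
            continue
        if all((perm[i], perm[i + 1]) in edge_set for i in range(n - 1)):
            found.append(perm)
    return found

# Illustrative directed graph on 4 vertices
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
paths = hamiltonian_paths(edges, 4, start=0, end=3)
```

The exponential blow-up the abstract mentions is visible here too: the candidate pool grows as n!, which is exactly why the brute-force approach (in DNA volume or in CPU time) does not scale.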

Software Professionals Use Object Oriented Data Modeling Instead Of Traditional Relational Data Modeling

The purpose of this paper is to explain why object-oriented data modeling is more popular than relational data modeling. A data model is a logical organization of real-world objects (entities), the constraints on them, and the relationships among them. The relational model is very simple, since data is represented in the form of relations depicted as two-dimensional tables: rows represent records and columns represent attributes of the entity. The basic concept in the relational model is the relation. In the object-oriented model the main construct is the object: just as the relational model has relations, OO data modeling has objects. So the first task in an OO model is to identify the objects of the system, which can be done by examining the problem statement. Another important task is to identify the various operations on these objects. It is easy to relate objects to real-world entities. The object-oriented approach has proved especially fruitful in application areas, such as the design of geographical information systems, that have a richly structured knowledge domain and are associated with multimedia databases. Relational data modeling differs from object-oriented data modeling in that it focuses solely on data, while object-oriented data models focus on both the behavior and the data aspects of the domain. An OODBMS can be faster than a relational DBMS because data is not stored in relational rows and columns but as objects; objects can have many-to-many relationships and are accessed through pointers.
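The contrast between join-based and pointer-based navigation can be shown in a few lines. The author/book domain, table layouts, and class names below are illustrative assumptions, not from the paper.

```python
from dataclasses import dataclass, field

# Relational style: flat rows linked by foreign keys, joined at query time.
authors_table = [(1, "Ada"), (2, "Lin")]                 # (id, name)
books_table = [(10, "Graphs", 1), (11, "Maps", 1)]       # (id, title, author_id)

def books_by(author_id):
    """A hand-rolled join: scan the books relation for matching foreign keys."""
    return [title for _, title, aid in books_table if aid == author_id]

# Object style: the relationship is a direct reference (a "pointer"),
# so navigating from author to books, or book back to author, needs no join.
@dataclass
class Author:
    name: str
    books: list = field(default_factory=list)

@dataclass
class Book:
    title: str
    author: "Author"

ada = Author("Ada")
ada.books = [Book("Graphs", ada), Book("Maps", ada)]
```

Both representations answer the same query, but the object version also bundles behavior with the data (methods could live on `Author`), which is the modeling difference the abstract emphasizes.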
