Home | Conferences | Techknowledge | 9th National Conference


back 1 2 3 4 5 6 7 8 9 10 next total: 92 | displaying: 61 - 70

Knowledge Management (A Tool for Managing Intellectual Excellence)

Knowledge refers to a person’s state of being with respect to some body of information. These states include ignorance, awareness, familiarity, understanding, facility and so on. Knowledge management comprises a range of strategies and practices used in an organization to identify, create, represent, distribute, and enable adoption of insights and experiences. Such insights and experiences comprise knowledge, either embodied in individuals or embedded in organizational processes or practice. ... Read Full Article

Impact Of Information Technology On Employee Retention In Today’s Competitive Business Environment

In today’s business environment, the explosion of information technology has completely transformed the structure of organizations, yet retaining valuable employees remains one of the biggest problems. This paper focuses on information technology and on the employee retention strategies and policies that assist managers in achieving their objectives for managing employee retention and remaining competitive, utilizing survey data drawn from various government and private departments. ... Read Full Article

Data Mining Problem Solving Algorithms & Their Comparative Study

This paper presents a comparative study of the data mining algorithms identified by the IEEE International Conference on Data Mining (ICDM): C4.5, k-Means, Apriori, FP-growth, PageRank, AdaBoost, kNN, and CART. These are among the most influential data mining algorithms in the research community. For each algorithm, we provide a description, compare it with the others, discuss its impact, and review current and further research on it. These algorithms cover classification. ... Read Full Article
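To give a flavor of one of the algorithms surveyed above, here is a minimal k-Means sketch in Python. This is an illustration only, not the reference implementation discussed at ICDM; the 1-D sample data and k = 2 are assumptions chosen for clarity.

```python
import random

def k_means(points, k, iterations=20, seed=0):
    """Minimal k-Means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: (p - centroids[i]) ** 2)
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious 1-D clusters, centered near 1.0 and 10.0
data = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
print(k_means(data, k=2))  # centroids converge near 1.0 and 10.0
```

The same assign-then-update loop generalizes to higher dimensions by replacing the squared difference with a vector distance.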

A Comprehensive Approach To Grid Computing

Today we live in the Internet world, and everyone prefers fast access to the Internet. But with multiple simultaneous downloads, there is a chance that the system hangs or its performance slows, forcing the entire process to restart from the beginning. This is a serious problem that needs the attention of researchers, so we have taken it up in our research. In this paper we provide a layout for implementing our proposed Grid model for very fast Internet access. Using our Grid, any number of files can be downloaded very quickly, depending on the number of systems employed in the Grid. We have used the concept of Grid Computing for this purpose. ... Read Full Article
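The idea of spreading one download across the systems in a grid can be sketched as a byte-range partitioning step. The helper name `split_ranges` is invented for illustration, and it assumes a server that honors HTTP `Range` requests; the paper's actual Grid model may divide work differently.

```python
def split_ranges(total_bytes, n_nodes):
    """Divide a file of total_bytes into contiguous byte ranges,
    one per grid node, so each node can fetch its slice in parallel
    (e.g. with an HTTP 'Range: bytes=start-end' request)."""
    base, extra = divmod(total_bytes, n_nodes)
    ranges, start = [], 0
    for i in range(n_nodes):
        size = base + (1 if i < extra else 0)  # spread the remainder
        ranges.append((start, start + size - 1))
        start += size
    return ranges

# A 1000-byte file split across 3 grid nodes
print(split_ranges(1000, 3))  # [(0, 333), (334, 666), (667, 999)]
```

Each node downloads its slice concurrently, and the client reassembles the slices in order, so total time scales with the number of systems employed.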

Software Security: An Emerging Problem In Software Engineering

The objective of this research paper is to discuss software security as a problem in software engineering. The solution to this problem leads to a new constructive approach to the modeling, specification, and analysis of application-specific security requirements. The approach is based on a framework we developed previously for generating and resolving obstacles to requirements achievement. Our framework models the intentional obstacles set up by attackers to break security goals. Attack trees are derived systematically through anti-goal refinement until leaf nodes are reached that are either software vulnerabilities observable by the attacker or anti-requirements implementable by that attacker. New security requirements are then obtained as countermeasures by applying threat resolution operators to the anti-requirements and vulnerabilities revealed by the analysis. The paper also introduces formal epistemic specification constructs and patterns that may be used to support a formal derivation and analysis process. The method is illustrated on a web-based system for which subtle attacks have been reported recently. ... Read Full Article
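The attack trees mentioned in the abstract combine attacker sub-goals with AND/OR refinements down to exploitable leaves. The following sketch shows the data structure and its feasibility check; the node names and the example tree are invented for illustration and are not from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    """A node in an attack tree: an attacker goal is feasible if all
    (AND) or any (OR) of its refinements are feasible; leaves are
    vulnerabilities the attacker can directly exploit."""
    name: str
    gate: str = "LEAF"          # "AND", "OR", or "LEAF"
    children: list = field(default_factory=list)
    feasible: bool = False      # only meaningful for leaves

    def is_feasible(self):
        if self.gate == "LEAF":
            return self.feasible
        results = [c.is_feasible() for c in self.children]
        return all(results) if self.gate == "AND" else any(results)

# Hypothetical example: stealing a session requires BOTH intercepting
# traffic AND one of two ways of obtaining the token.
steal = AttackNode("steal session", "AND", [
    AttackNode("intercept traffic", feasible=True),
    AttackNode("obtain token", "OR", [
        AttackNode("predictable token", feasible=False),
        AttackNode("token in URL", feasible=True),
    ]),
])
print(steal.is_feasible())  # True
```

A countermeasure corresponds to making some leaf infeasible; the tree then shows whether the root goal is still achievable through another branch.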

Simulating Using ns-2

1.    Introduction: ns-2 is called an event simulator because it maintains a list of events. Processing consists of taking the first event and running it until it is ... Read Full Article
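The event list this abstract refers to can be illustrated with a minimal discrete-event loop in Python. This is a generic sketch of the idea, not ns-2's actual scheduler (ns-2 is written in C++/OTcl); the class and its events are invented for illustration.

```python
import heapq

class EventSimulator:
    """Minimal discrete-event simulator: events are (time, action)
    pairs kept in a priority queue; the loop repeatedly pops the
    earliest event, advances the clock to its timestamp, and runs it."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker for events scheduled at the same time

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            time, _, action = heapq.heappop(self._queue)
            self.now = time
            action()

log = []
sim = EventSimulator()
sim.schedule(2.0, lambda: log.append(("recv", sim.now)))
sim.schedule(1.0, lambda: log.append(("send", sim.now)))
sim.run()
print(log)  # [('send', 1.0), ('recv', 2.0)]
```

Note that events run in timestamp order regardless of the order they were scheduled, which is the defining property of an event-driven simulator.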

Network Simulation Using NCTUns

Introduction: A network simulator is software that is a very helpful tool for developing, testing, and diagnosing any network protocol. To implement any kind of network, with any link bandwidth, propagation delay, routers, etc., we do not need to set up an actual network; hence it is very economical, and the results obtained are easier to analyze. A simulator needs to simulate various networking devices, application programs, and network utility programs, so developing a simulator takes great effort. ... Read Full Article

E-Commerce: Its Impact, Trends & Opportunities In Supply Chain Management

Introduction: Electronic commerce, commonly known as e-commerce or eCommerce (electronic marketing), consists of the buying and selling of products or services over electronic systems such ... Read Full Article

Applying Knowledge Management (KM) For Sustainable Business Education

1.    Introduction: Today, the business education system is under great pressure from industry (society) to deliver finished products (graduates and postgraduates) from its system so as ... Read Full Article

Emerging Paradigms Of DNA Based Computation

Introduction: DNA computing is a novel technology that seeks to capitalize on the enormous informational capacity of DNA: biological molecules that can store huge amounts of information and perform operations similar to those of a computer, through the deployment of enzymes, biological catalysts that act like software to execute desired operations. The appeal of DNA computing lies in the fact that DNA molecules can store far more information than any existing conventional computer chip. Also, utilizing DNA for complex computation can be much faster than utilizing a conventional computer, for which massive parallelism would require large amounts of hardware rather than simply more DNA. Scientists have found the new material they need to build the next generation of microprocessors: millions of natural supercomputers exist inside living organisms, including your body. DNA (deoxyribonucleic acid) molecules, the material our genes are made of, have the potential to perform calculations many times faster than the world's most powerful human-built computers. DNA might one day be integrated into a computer chip to create a so-called biochip that will push computers even faster, and DNA molecules have already been harnessed to perform complex mathematical problems. While still in their infancy, DNA computers will be capable of storing billions of times more data than your personal computer.
The practical possibility of using molecules of DNA as a medium for computation was first demonstrated by Leonard Adleman in 1994, a giant step towards a chemical, or artificial biochemical, computer. He used fragments of DNA to compute the solution to a complex graph theory problem. Adleman's method uses sequences of DNA's molecular subunits to represent the vertices of a network, or "graph". Combinations of these sequences, formed randomly by the massively parallel action of biochemical reactions in test tubes, describe random paths through the graph. Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem from the many random paths represented by the product DNA strands. His primary intention was to prove the feasibility of biomolecular computation, but his work also indicated that this new computational paradigm could provide an advantage over conventional electronic computing techniques. Specifically, DNA was shown to have massively parallel processing capabilities that might allow a DNA-based computer to solve hard computational problems in a reasonable amount of time. However, Adleman's brute-force search algorithm is not, and was never meant to be, a practical means of solving such problems; the volume of material required was found to increase exponentially as the complexity of the problem increased. The main idea behind DNA computing is to adopt a biological (wet) technique as an efficient computing vehicle, where data are represented using strands of DNA. Even though a DNA reaction is much slower than the cycle time of a silicon-based computer, the inherently parallel processing offered by the DNA process plays an important role. ... Read Full Article
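Adleman's brute-force scheme, generating many random paths "in parallel" and then filtering out any path that starts at the first vertex, ends at the last, and visits every vertex exactly once, can be mimicked in software. The following is a toy simulation of the idea only: the 4-vertex graph is an assumption, and a real DNA computation performs these steps chemically on strands in a test tube, not in a loop.

```python
import random

def adleman_search(edges, n_vertices, trials=5000, seed=1):
    """Toy analogue of Adleman's experiment: 'synthesize' many random
    walks through the graph (the test-tube step), then 'extract' any
    walk that starts at 0, ends at n_vertices-1, and visits every
    vertex exactly once (the biochemical filtering steps)."""
    rng = random.Random(seed)
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    for _ in range(trials):
        path = [0]
        while len(path) < n_vertices and adj.get(path[-1]):
            path.append(rng.choice(adj[path[-1]]))
        if (len(path) == n_vertices and path[-1] == n_vertices - 1
                and len(set(path)) == n_vertices):
            return path
    return None

# Hypothetical 4-vertex graph whose only Hamiltonian path is 0 -> 1 -> 2 -> 3
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 1)]
print(adleman_search(edges, 4))  # [0, 1, 2, 3]
```

The exponential blow-up noted above shows up here too: as the graph grows, the number of random walks (or DNA strands) needed to hit a valid path grows far faster than the graph itself.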
