
11th National Management Conference

total: 114 | displaying: 61 - 75

The Pricing of IPOs & FPOs in Indian Capital Market

The Indian capital market has long been a prominent venue for public issues. Although retail participation by the common investor remains low, Initial Public Offerings (IPOs) and Follow-on Public Offerings (FPOs) have been very profitable investment instruments both for retail investors and for Qualified Institutional Bidders (QIBs). During 2010, 72 companies raised Rs. 69,192 crore from the primary market through IPOs and FPOs, more than three times the 21 issues of 2009. Of these 72 issues, 64 were IPOs that raised Rs. 31,615 crore and the remaining 8 were FPOs that raised Rs. 31,577 crore. The study “Pricing of IPOs & FPOs in Indian Capital Market” examines the performance of IPOs and FPOs in the secondary market. The pricing of the 2010 issues was studied using secondary data collected from the NSE, the BSE and other relevant sources. The researchers start from the common assumption that investments in IPOs and FPOs are safe, nearly risk free, and yield good returns. The research found that the short-term returns of some companies’ IPOs and FPOs were very promising, while others returned less than their issue price. In the recent past, several large equity offerings, including those from reputable business houses, have failed to reach their price targets, and many of the 72 companies that raised Rs. 69,192 crore from Indian investors through IPOs and FPOs are quoting below their issue price. The research therefore asks whether these issues are being offered at a fair price, and whether investing in IPOs or FPOs is a wise decision or the investor should instead wait for a better price in the secondary market. Keywords: Initial Public Offering (IPO), Follow-on Public Offering (FPO), Qualified Institutional Bidders (QIBs), short-term returns, performance review, pricing of IPOs and FPOs.
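The pricing question above reduces to comparing the issue price with the secondary-market price. A minimal sketch of the short-term (listing-day) return computation the study relies on; the prices used in the comments are illustrative, not figures from the study's dataset:

```python
def listing_return(issue_price: float, listing_price: float) -> float:
    """Simple short-term return of an IPO/FPO between issue and
    listing (or early secondary-market) price, expressed in percent."""
    return (listing_price - issue_price) / issue_price * 100.0

# Illustrative only: an issue offered at Rs. 100 that lists at Rs. 112
# has gained 12%; one quoting at Rs. 85 is 15% below its issue price.
```

A negative value is exactly the "quoting below issue price" case the abstract describes.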

Financial Markets and Instruments

Financial markets match those who have capital with those who need it. A financial market is a mechanism that allows people to buy and sell financial securities (such as stocks and bonds), commodities and other fungible items of value at low transaction costs and at prices that, under the efficient-market hypothesis, reflect all available information. Financial markets facilitate:
• the raising of capital in the capital markets;
• the transfer of risk in the derivative markets;
• international trade in the currency markets.
A financial instrument is either cash, evidence of an ownership interest in an entity, or a contractual right to receive or deliver cash or another financial instrument. It may be a real or virtual document representing a legal agreement involving some sort of monetary value. A sharp acceleration in the pace of innovation, deregulation and structural change in recent years has transformed the financial system in important ways.

Financial Markets & Instruments: Factoring and Forfeiting

The main problem today when trading with foreign countries is obtaining payment from importers. This paper focuses on how two major innovative financing tools, factoring and forfeiting, help financing companies offer financial support to traders in exchange for fees and guarantees, and on the problems those companies face. For the exporter, both methods maximize cash flow, reduce transaction risk, and may enhance competitiveness by allowing flexible payment terms to be offered to the buyer.
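The cash-flow benefit to the exporter can be sketched numerically: the factor advances a percentage of the invoice value up front and charges a fee. The advance rate and fee rate below are hypothetical illustration values, not market figures:

```python
def factoring_advance(invoice_value: float,
                      advance_rate: float = 0.80,
                      fee_rate: float = 0.02):
    """Sketch of factoring economics: cash the exporter receives up
    front, and the factor's fee. Both rates are illustrative assumptions."""
    fee = invoice_value * fee_rate
    advance = invoice_value * advance_rate
    return advance, fee
```

The exporter trades the fee for immediate liquidity instead of waiting for the importer's payment.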

Commercialization of Microfinance in India

Microfinance is gathering momentum to become a significant force in India. This paper discusses the growth, transformation and commercialization of microfinance organizations (MFOs) in India. The basic argument is that most early microfinance in India was funded by donor and philanthropic money, which flowed into not-for-profit organizations. However, as activities scaled up, it became imperative to move to a commercial format. The aim is to examine the growth imperatives and the transformation process of Indian MFOs.

FDI in Indian Retail Sector: Good or Bad

Retailing is the interface between the producer and the individual consumer buying for personal consumption. The retail industry in India has of late often been hailed as one of the sunrise sectors of the economy. The recent clamor about opening the retail sector to Foreign Direct Investment (FDI) is a very sensitive issue, with arguments on both sides of the debate. It is widely acknowledged that FDI can have positive effects on the economy, triggering a series of reactions that in the long run can lead to greater efficiency and improved living standards, apart from greater integration into the global economy. With the infusion of FDI into retail trade, the consumer would benefit from both price reductions and improved selection, brought about by the technology and know-how of foreign players in the market; this in turn can lead to greater output and domestic consumption. The most important argument against FDI-driven “modern retailing”, however, is that it is labor-displacing to the extent that it can expand only by destroying the traditional retail sector. This paper tries to throw light on the positive and negative effects of FDI on the Indian retail industry.

An Empirical Study of Non-Linear Relationship between World Oil Price & Indian Stock Exchange

This paper formulates an econometric model to investigate the relationship between world oil price changes and stock exchange returns in India in the presence of regime-switching dynamics. A two-regime multivariate Markov-switching vector autoregression (MS-VAR) model, with regime shifts in both the mean and the variance, is used to extract common regime-switching behavior from the price index series; the stock return series show evidence of a relationship among the series. The interaction between oil price changes and economic parameters has received considerable attention from policymakers. The estimated MS-VAR model reveals that an increase in the oil price is followed by a rise in the stock price index. The MS-VAR model is thus a good option for estimating the non-linear relationship between world oil prices and the Indian stock exchange.
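The data-generating process an MS-VAR captures can be illustrated with a toy simulation: each regime has its own mean and variance, and a hidden Markov chain decides which regime governs each observation. This is a minimal univariate sketch of the switching mechanism, not the paper's estimated model; all parameter values are assumptions:

```python
import random

def simulate_ms_series(n, p_stay=0.9, means=(0.001, -0.002),
                       sigmas=(0.01, 0.03), seed=42):
    """Simulate returns from a two-regime Markov-switching process.
    Regime 0 is a calm state, regime 1 a volatile one; the chain stays
    in its current regime with probability p_stay each step.
    All parameters are illustrative."""
    rng = random.Random(seed)
    regime, returns, regimes = 0, [], []
    for _ in range(n):
        if rng.random() > p_stay:          # switch to the other regime
            regime = 1 - regime
        returns.append(rng.gauss(means[regime], sigmas[regime]))
        regimes.append(regime)
    return returns, regimes
```

Estimation (as in the paper) runs this logic in reverse: given only the returns, it infers the regime path and the per-regime means and variances.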

An Enhanced Congestion Control Mechanism for Mobile Ad-Hoc Network

The very dynamic nature of mobile ad-hoc networks creates great challenges for routing protocols. The Transmission Control Protocol (TCP) provides a connection-oriented, reliable, end-to-end mechanism, but its congestion control mechanism is not directly suitable for wireless networks. Many improved TCP congestion control mechanisms have been presented, yet these schemes remain insufficient for controlling congestion. This paper presents an improved TCP congestion control mechanism. The proposed scheme recalculates the sending window after each transmission according to the number of destroyed or corrupted packets, which results in fewer packet drops during transmission. A comparative study of Enhanced TCP (ETCP) against other TCP variants is also presented, with variation in node speed, pause time and the number of nodes in the network. Implementation and analysis show that the proposed mechanism generates less overhead and improves reliability, with small variances in throughput and delay. Implementations and simulations were performed in the QualNet 5.0 simulator.
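The core idea (adjusting the sending window per transmission from the count of destroyed or corrupted packets) can be sketched as a window-update rule. The exact update formula below is an assumption for illustration; the paper does not publish its rule here:

```python
def next_window(cwnd: int, corrupted: int,
                min_window: int = 1, max_window: int = 64) -> int:
    """Sketch of a per-transmission window update: additive increase
    on a clean round, back off in proportion to the number of
    destroyed/corrupted packets otherwise. The rule is illustrative,
    not ETCP's actual formula."""
    if corrupted == 0:
        cwnd += 1                                  # clean round: grow
    else:
        cwnd = max(min_window, cwnd - corrupted)   # shrink per loss
    return min(cwnd, max_window)
```

Tying the decrease to the observed corruption count, rather than halving on any loss as standard TCP does, is what lets such a scheme distinguish wireless corruption from genuine congestion.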

Hold your Session: An Attack on JAVA Session-ID Generation

HTTP session-ids play an important role in almost any website today. This paper presents a cryptanalysis of the Java Servlet 128-bit session-id and an efficient, practical prediction algorithm. Using this attack, an adversary may impersonate a legitimate client. Through the analysis we also present a novel, general space-time tradeoff for attacks on secure pseudo-random number generators.
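The attack rests on a general principle: a non-cryptographic PRNG is fully determined by its internal state, so once the state is recovered, every future output (and hence every future session-id derived from it) is predictable. A sketch of that principle using the published java.util.Random linear congruential generator, reimplemented in Python (this only illustrates predictability; the paper's target generator and recovery algorithm are its own):

```python
# Published java.util.Random LCG parameters (48-bit state).
MULT, ADD, MASK = 0x5DEECE66D, 0xB, (1 << 48) - 1

def lcg_next(state: int, bits: int = 32):
    """One step of the java.util.Random LCG: returns (new_state, output)."""
    state = (state * MULT + ADD) & MASK
    return state, state >> (48 - bits)

def predict(state: int, n: int):
    """Given a recovered internal state, predict the next n 32-bit outputs."""
    outputs = []
    for _ in range(n):
        state, out = lcg_next(state)
        outputs.append(out)
    return outputs
```

Because the state is only 48 bits, it can even be brute-forced from a couple of observed outputs; a 128-bit id built from such a generator gains no real security from its length.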

Timely Computing Base Transaction in DBMS

Current database systems are very often based on large-scale, unpredictable and unreliable infrastructures. On-time data management is becoming a key difficulty for the information infrastructure of most organizations, and database applications in critical areas give increasing importance to the timely execution of transactions. Database applications with timeliness requirements have to deal with the possible occurrence of timing failures, when the operations specified in a transaction do not complete within the expected deadlines. In spite of the importance of timeliness requirements, typical commercial DBMSs do not assure any temporal properties, not even detection of the cases in which a transaction takes longer than the expected or desired time. In this paper, we propose an architectural construct and programming model that address this problem. We assume the existence of a component capable of executing timely functions, however asynchronous the rest of the system may be. The paper also discusses the problem of timing failure detection in database applications and proposes a transaction programming approach to help developers program database applications with time constraints.
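The programming model can be sketched as a wrapper that attaches a deadline to a transaction body and signals a timing failure when the deadline is missed. This is a simplified after-the-fact check under assumed names (`TimingFailure`, `timed_transaction`); the paper's timely computing base component detects the failure as it happens, which this sketch does not attempt:

```python
import time

class TimingFailure(Exception):
    """Raised when a transaction misses its deadline."""

def timed_transaction(work, deadline_s: float):
    """Run a transaction body and detect whether it completed within
    its deadline. Illustrative sketch only: detection here happens
    after completion, not on time as in the proposed architecture."""
    start = time.monotonic()
    result = work()                       # the transaction body
    if time.monotonic() - start > deadline_s:
        raise TimingFailure("transaction exceeded its deadline")
    return result
```

In use, the application would catch `TimingFailure` and run its compensation logic, rather than silently accepting a late commit as commercial DBMSs do today.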

Computational Intelligence: Study of Specialized Methodologies of Soft Computing in Bioinformatics

Soft computing, the study of specialized methodologies, now covers a broad area of bioinformatics. Soft computing offers many standard methodologies, but none of them by itself provides the best approach for bioinformatics problems; this motivates the study of specialized soft computing methodologies in bioinformatics. This paper describes a process for integrating aspects and experience from different core subjects such as biology, medicine, computer science, engineering, chemistry, physics, and mathematics. The use of soft computing tools for solving bioinformatics problems has recently been gaining the attention of researchers because of their ability to handle imprecision and uncertainty in large and complex search spaces.

Save-Energy VoIP over Wireless LANs

The key idea in saving energy at the wireless interface is to allow it to sleep as much as possible, reducing the time spent in the idle state; there is typically about an order of magnitude difference between power consumption in the idle and sleep states. This is a difficult problem, since the radio may not know exactly when it has to wake up to receive incoming packets and will lose them if it stays in the sleep state. A standardized solution is the Power Save Mode (PSM) introduced in the IEEE 802.11 standard for infrastructure WLANs. Emerging dual-mode phones incorporate a Wireless LAN (WLAN) interface along with the traditional cellular interface. The additional benefits of the WLAN interface are, however, likely to be outweighed by its greater rate of energy consumption, which is especially of concern for real-time applications that generate continuous traffic. WLAN radios typically conserve energy by staying in sleep mode. With real-time applications like Voice over Internet Protocol (VoIP), this is challenging, since packets delayed beyond a threshold are lost, and the continuous nature of the traffic makes it difficult for the radio to stay in the low-power sleep mode long enough to reduce energy consumption significantly. In this work, we propose the Green Call algorithm to derive sleep/wake-up schedules for the WLAN radio that save energy during VoIP calls while ensuring that application quality is preserved within levels acceptable to users. We evaluate Green Call on commodity hardware, study its performance over diverse network paths, and describe our experiences in the process. We further investigate, through trace-based simulations, the effect of different application parameters on possible energy savings. We show that, in spite of the interactive, real-time nature of voice, energy consumption during calls can be reduced by close to 80 percent in most instances.
Index Terms—Voice over IP (VoIP), wireless LANs, energy consumption, portable communication devices, Internet.
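The claimed saving of close to 80 percent follows directly from the order-of-magnitude gap between idle and sleep power once the radio sleeps for most of the call. A sketch of the arithmetic; the specific power figures (800 mW idle, 50 mW sleep) are assumptions for illustration, since the abstract only states the rough ratio:

```python
def call_energy(duration_s: float, sleep_frac: float,
                p_idle_mw: float = 800.0, p_sleep_mw: float = 50.0) -> float:
    """Energy (mJ) consumed by the WLAN radio over a call when it
    sleeps for sleep_frac of the time and idles otherwise.
    Power figures are illustrative, not measured values."""
    p_avg = sleep_frac * p_sleep_mw + (1.0 - sleep_frac) * p_idle_mw
    return p_avg * duration_s
```

With these assumed figures, sleeping 85% of a call gives an average draw of 162.5 mW against 800 mW always-idle, i.e. roughly an 80% reduction, consistent with the result reported above.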

Intrusion Detection using k-means Algorithm via Support Vector Machine

It is unrealistic to prevent security breaches completely using existing security technologies, so intrusion detection plays an important role in network security. However, many current intrusion detection systems (IDSs) are signature-based. A signature-based IDS, also known as misuse detection, looks for a specific signature to match, signaling an intrusion. Provided with signatures or patterns, such systems can detect many or all known attack patterns, but they are of little use against as-yet-unknown attack methods: the false-positive rate is small to nil, but these systems are poor at detecting new attacks, variations of known attacks, or attacks masked as normal behavior. Statistical-Based Intrusion Detection Systems (SBIDSs) can alleviate many of these pitfalls and perform better than signature-based IDSs at novelty detection, i.e. the detection of new attacks, which is very important for an intrusion detection system. Researchers have evaluated various classification techniques for intrusion detection. This work evaluates a support vector machine (SVM) based classifier over a benchmark dataset, exploring non-linear SVMs, both binary and multiclass, over the KDD 1999 dataset. A non-linear SVM maps the input features to a feature space using a kernel function. The performance of the SVM is evaluated using different kernel functions, and the work also tries to find the optimal kernel function using the kernel width parameter.
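The kernel function is what lets a non-linear SVM work in a high-dimensional feature space without computing the mapping explicitly. The abstract does not name the kernels compared, so as an illustration here is the widely used Gaussian (RBF) kernel, whose width sigma is the kind of parameter the study tunes:

```python
import math

def rbf_kernel(x, y, sigma: float = 1.0) -> float:
    """Gaussian (RBF) kernel: K(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    sigma is the kernel width; small sigma makes the kernel very local,
    large sigma makes distant points look similar."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))
```

Choosing sigma trades off fitting known attack patterns tightly against generalizing to unseen variations, which is exactly the novelty-detection tension described above.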

Writing Good Software Engineering Research Papers

Software engineering researchers solve problems of several different kinds. To do so, they produce several different kinds of results, and they should develop appropriate evidence to validate these results. They often report their research in conference papers. I analyzed the Abstracts of research papers submitted to ICSE 2002 in order to identify the types of research reported in the submitted and accepted papers, and I observed the program committee discussions about which papers to accept. This report presents the research paradigms of the papers, common concerns of the program committee, and statistics on success rates. This information should help researchers design better research projects and write papers that present their results to best advantage.

Re-Joining to Internet: Analysis of User by Markov Chain Model

The need for the internet has spread throughout the world, and a large number of people join the user group every day. In places where broadband is not available, especially rural areas, people use dial-up connections, which suffer from frequent non-connectivity and call disconnection. A user leaves the computer terminal after multiple failed connection attempts, but some hard-core, dedicated dial-up users re-join the attempt process after being away for a while. If re-joining has some probability, it needs to be studied intensively. This paper analyzes the re-joining probability using a Markov chain model of consumer behavior and internet traffic sharing.
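The behavior described fits a two-state Markov chain: the user is either attempting to connect or away from the terminal, leaving after failures with some probability and re-joining with another. A minimal sketch of the steady-state calculation for such a chain (the state labels and closed form are illustrative, not the paper's actual model):

```python
def steady_state_attempting(p_rejoin: float, p_leave: float) -> float:
    """Long-run fraction of time a user is in the 'attempting' state of a
    two-state Markov chain, where the user leaves the terminal with
    probability p_leave per step and re-joins with probability p_rejoin.
    Derived from the balance equation pi_attempt * p_leave = pi_away * p_rejoin
    together with pi_attempt + pi_away = 1."""
    return p_rejoin / (p_rejoin + p_leave)
```

For example, with a re-join probability of 0.3 and a leave probability of 0.1, the user spends 75% of the long run attempting; a higher re-join probability means more sustained demand despite the connectivity failures.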

Security Based Comparison of Relational and Object Oriented Distributed Database Model

In today's environment, distributed databases are mainly used by large organizations for their striking features. When developing a distributed database there are many features to consider, but security is the main concern. When choosing between the object-oriented model and the relational model, many factors should be considered; the most important of these are single-level and multilevel access controls, protection against inference, and maintenance of integrity. When determining which distributed database model will be more secure for a particular application, the decision should not be made purely on the basis of available security features; one should also question the efficiency with which those features are delivered. Do the features provided by the database model give adequate security for the intended application? Does the implementation of the security controls add an unacceptable amount of computational overhead? In this paper, the security strengths and weaknesses of both database models, and the special problems found in the distributed environment, are discussed.
