Thursday, July 29, 2010

Artificial intelligence substation controller

     Controlling a substation with a fuzzy controller speeds up the response time and diminishes the risks normally associated with human operation. The automation of electric substations is an area under constant development. Our research has focused on the selection of the magnitudes to be controlled, the definition and implementation of the soft-computing techniques, and the elaboration of a programming tool to execute the control operations. It is possible to control the desired status while supervising important magnitudes such as the voltage, power factor, and harmonic distortion, as well as the present status. The status of the circuit breakers can be controlled by using a knowledge base that relates some of the operation magnitudes, mixing status variables with time variables and fuzzy sets. The number of magnitudes required to supervise and control a substation can be very high; in the present research work, many magnitudes were not included in order to avoid an excessive number of rules. Nevertheless, controlling a substation with a fuzzy controller has the advantage that it can speed up the response time and diminish the possibility of risks normally related to human operations.
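The rule-based control described above can be sketched in a few lines. This is a minimal illustration only: the membership functions, the supervised magnitudes, and the thresholds below are assumptions made for the example, not the rules used in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def breaker_trip_degree(voltage_pu, thd_pct):
    """Fuzzy degree (0..1) to which the breaker should trip, from two magnitudes."""
    overvoltage = tri(voltage_pu, 1.05, 1.15, 1.25)   # "voltage is high"
    distortion = tri(thd_pct, 5.0, 10.0, 15.0)        # "harmonic distortion is high"
    # Rule: IF voltage is high OR distortion is high THEN trip (fuzzy OR = max)
    return max(overvoltage, distortion)

print(breaker_trip_degree(1.15, 2.0))  # 1.0: distortion nominal, overvoltage at its peak
```

A real controller would combine many such rules, bring in the time variables mentioned above, and defuzzify the aggregated result before acting on the breaker.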

Download Here : Artificial intelligence substation controller.doc


    The application of the research work done is directed towards people who are visually impaired. People suffering from low vision as well as people who are completely blind will benefit from this project. The findings regarding the biocompatibility of implant materials will aid other similar attempts at human-machine interfaces. Congenital defects in the body, which cannot be fully corrected through surgery, can then be corrected.

 There has been a marked increase in research and clinical work aimed at understanding low vision. Future work has to focus on the optimization and further miniaturization of the implant modules. Commercially available systems have started emerging that integrate video technology, image processing and low vision research.

   Implementation of an artificial eye has advantages. An electronic eye is more precise and enduring than a biological eye, though we cannot altogether say that it would be used only to benefit the human race. In short, successful implementation of a bioelectronic eye would solve many of the visual anomalies suffered by humans to date.

Download Here : ARTIFICIAL EYE.doc


     In this age of high-speed communications it has become quite necessary for a person to stay connected to the Internet wherever and whenever possible. To fulfill these needs, WAP is being used. Using this technology, web pages can be viewed on simple hand-held, WAP-enabled devices. This is made possible by using various communication architectures like GSM, CDMA etc. Recently a new standard for wireless Internet called 1xEV-DO has been adopted, which is being billed as the basis for the next generation of high-speed wireless Internet access systems. In this paper we briefly introduce the working of a typical WAP-enabled system and later investigate how the recently adopted 1xEV-DO outperforms the existing architectures. We also study its compatibility with different operating systems and its efficient utilization of bandwidth with minimum spectrum usage.



The ability to perform long, accurate molecular dynamics (MD) simulations involving proteins and other biological macromolecules could in principle provide answers to some of the most important currently outstanding questions in the fields of biology, chemistry, and medicine. A wide range of biologically interesting phenomena, however, occur over timescales on the order of a millisecond—several orders of magnitude beyond the duration of the longest current MD simulations.

I describe a massively parallel machine called Anton, which should be capable of executing millisecond-scale classical MD simulations of such biomolecular systems. The machine, which is scheduled for completion by the end of 2008, is based on 512 identical MD-specific ASICs that interact in a tightly coupled manner using a specialized high-speed communication network. Anton has been designed to use both novel parallel algorithms and special-purpose logic to dramatically accelerate those calculations that dominate the time required for a typical MD simulation. The remainder of the simulation algorithm is executed by a programmable portion of each chip that achieves a substantial degree of parallelism while preserving the flexibility necessary to accommodate anticipated advances in physical models and simulation methods.

Download Here : Anton.pdf


Analog-Digital Hybrid Modulation for improved efficiency over Broadband Wireless Systems

This paper seeks to present ways to eliminate the inherent quantization noise component in digital communications, instead of conventionally making it minimal. It deals with a new concept of signaling called the Signal Code Modulation (SCM) technique. The primary analog signal is represented by two parts: a sample which is quantized and encoded digitally, and an analog component, which is a function of the quantization residual of the digital sample. The advantages of such a system are two-sided, offering the advantages of both analog and digital signaling. The presence of the analog residual allows the system performance to improve when excess channel SNR is available. The digital component provides increased SNR and makes it possible for coding to be employed to achieve near error-free transmission.

Index Terms—SCM, Hybrid Modulation, Quantized residual amplification.
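The split into a digital code plus an analog residual can be sketched numerically. This is a hedged illustration of the idea only; the step size and the simple scalar quantizer are assumptions, not the modulation scheme of the paper.

```python
import math

def scm_split(sample, step=0.25):
    """Split a sample into a digital integer code and the analog residual."""
    code = math.floor(sample / step)   # digital component: sent as coded bits
    residual = sample - code * step    # analog component in [0, step): sent in analog form
    return code, residual

def scm_reconstruct(code, residual, step=0.25):
    """Receiver: rebuild the sample from both components."""
    return code * step + residual

code, res = scm_split(0.8125)
print(code, res)                              # 3 0.0625
print(scm_reconstruct(code, res) == 0.8125)   # True: lossless when the residual survives
```

When the channel SNR is high, the analog residual arrives nearly intact and the reconstruction improves beyond what the digital bits alone would allow, which is the behavior the paper exploits.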

Download Here : Analog-Digital Hybrid Modulation for improved efficiency over Broadband Wireless Systems.pdf


Agile – denoting “the quality of being agile; readiness for motion; nimbleness, activity, dexterity in motion” – software development methods attempt to offer an answer to the eager business community asking for lighter-weight along with faster and nimbler software development processes. This is especially the case with the rapidly growing and volatile Internet software industry as well as for the emerging mobile application environment. The new agile methods have evoked a substantial amount of literature and debate. However, academic research on the subject is still scarce, as most existing publications are written by practitioners or consultants. This publication has three purposes. First, it proposes a definition and a classification of agile software development approaches. Second, it analyses four software development methods that can be characterized as “agile” against the defined criteria. Third, it compares these methods and highlights their similarities and differences. Based on this analysis, future research needs are identified and discussed.


Affective computing aims at developing computers with understanding capabilities vastly beyond today’s computer systems. Affective computing is computing that relates to, arises from, or deliberately influences emotion. It also involves giving machines skills of emotional intelligence: the ability to recognize and respond intelligently to emotion, the ability to appropriately express (or not express) emotion, and the ability to manage emotions. The latter ability involves handling both the emotions of others and the emotions within oneself.

Today, more than ever, the role of computers in interacting with people is important. Most computer users are not engineers and do not have the time or desire to learn, and stay up to date on, special skills for making use of a computer’s assistance. The emotional abilities imparted to computers are intended to help address the problem of interacting with complex systems, leading to smoother interaction between the two. Emotional intelligence, that is, the ability to respond to one’s own and others’ emotions, is often viewed as more important than mathematical or other forms of intelligence. Equipping computer agents with such intelligence will be the keystone in the future of computer agents.

Download Here : Affective-computing.pdf

Aeronautical Communications

The demand for making air travel more pleasant, secure and productive for passengers is one of the winning factors for airlines and the aircraft industry. Current trends are towards high data rate communication services, in particular Internet applications. In an aeronautical scenario global coverage is essential for providing continuous service. Therefore satellite communication becomes indispensable, and together with the ever-increasing data rate requirements of applications, aeronautical satellite communication meets an expansive market.

Wireless Cabin (IST-2001-37466) is looking into those radio access technologies to be transported via satellite to terrestrial backbones. The project will provide UMTS services, W-LAN IEEE 802.11b and Bluetooth to cabin passengers. With the advent of new services, a detailed investigation of the expected traffic is necessary in order to plan the capacities needed to fulfill the QoS demands. This paper will thus describe a methodology for the planning of such a system.

In the future, airliners will provide a variety of entertainment and communications equipment to the passenger. Since people are becoming more and more used to their own communications equipment, such as mobile phones and laptops with Internet connection, either through a network interface card or dial-in access through modems, business travelers will soon be demanding wireless access to communication services.

Download Here : Aeronautical Communications.pdf

Adaptive Routing in Adhoc Networks

The dynamics of an ad hoc network are a challenge to protocol design because mobility inevitably leads to unstable routing, and consequently flows encounter fluctuations in resource availability on various paths during the lifetime of a session. This has become serious, especially for those protocols based on single-path reservation, as frequent reservation and restoration of reservation-based flows increase the instability of the network.

Advances in wireless research are focusing more and more on the adaptation capability of routing protocols due to the interrelationship among various performance measures such as those related to topological changes (link breakages, node mobility, etc.) and quality of service (QoS) parameters (load, delay, etc.).

After giving a more detailed discussion of the existing work in adaptive routing, we propose a new routing protocol for ad hoc wireless networks - Multipath Source Routing (MSR), which is an extension of DSR (Dynamic Source Routing) that incorporates a multipath mechanism into DSR. Based on the measurement of RTT (Round-Trip Time), we propose a scheme to distribute load among multiple paths.
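One simple way to realize such RTT-based load distribution is to weight each path in inverse proportion to its measured round-trip time. The formula below is a hypothetical sketch for illustration; the paper's actual distribution scheme may differ.

```python
def load_shares(rtts_ms):
    """Fraction of traffic to place on each path, inversely proportional to RTT."""
    inverse = [1.0 / rtt for rtt in rtts_ms]
    total = sum(inverse)
    return [w / total for w in inverse]

# A path with half the RTT receives twice the traffic:
shares = load_shares([50.0, 100.0])
print([round(s, 3) for s in shares])  # [0.667, 0.333]
```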

MSR is an adaptive routing protocol for ad hoc networks, and it considers the two fundamental issues in its design. MSR adapts to topology changes by retaining the route discovery and route maintenance mechanisms of DSR. In addition, MSR employs a probing-based load-balancing mechanism. Simulation results show that MSR can improve the packet delivery ratio and the throughput of TCP and UDP, and that it reduces the end-to-end delay and the average queue size while adding little overhead.

As a result, MSR decreases network congestion and increases the path fault tolerance quite well.

Download here : Adaptive Routing in Adhoc Networks.pdf

Active blind noise suppression in some speech processing

In many applications of speech processing the noise reveals some specific features. Although the noise can be quite broadband, there are a limited number of dominant frequencies which carry most of its energy. This fact motivates the use of narrow-band notch filters, which must be adaptive in order to track changes in the noise characteristics. In the present contribution, a method and a system for noise suppression are developed. The method uses adaptive notch filters based on a second-order Gray-Markel lattice structure.
The main advantages of the proposed system are that it has very low computational complexity, is stable during adaptation, and has a short adaptation time. For comparable SNR improvement, the proposed method adjusts only 3 coefficients, against 250-450 for conventional adaptive noise cancellation systems. A framework for a speech recognition system that uses the proposed method is suggested.
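To make the notch-filter idea concrete, here is a fixed-frequency second-order notch in direct form. Note this is not the adaptive Gray-Markel lattice of the paper: the center frequency is fixed, and the chosen values (notch at 50 Hz, pole radius 0.95) are illustrative assumptions.

```python
import math

def notch_coeffs(f0, fs, r=0.95):
    """Second-order notch at f0 Hz: zeros on the unit circle, poles just inside."""
    w0 = 2 * math.pi * f0 / fs
    b = [1.0, -2 * math.cos(w0), 1.0]        # numerator: zeros at exp(+/- j*w0)
    a = [1.0, -2 * r * math.cos(w0), r * r]  # denominator: r sets the notch width
    return b, a

def filt(b, a, x):
    """Run the IIR difference equation over the samples in x."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

# A dominant 50 Hz noise tone is driven to (numerically) zero in steady state:
fs = 8000
b, a = notch_coeffs(50.0, fs)
tone = [math.sin(2 * math.pi * 50.0 * n / fs) for n in range(4000)]
out = filt(b, a, tone)
print(max(abs(v) for v in out[2000:]) < 0.05)  # True: tone suppressed after the transient
```

An adaptive version would additionally update the notch frequency from the input signal, which is where the low 3-coefficient count of the paper's lattice structure pays off.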

Download here : Active blind noise suppression in some speech processing.doc


Mentally, driving is a highly demanding activity - a driver must maintain a high level of concentration for long periods and be ready to react within a split second to changing situations. In particular, drivers must constantly assess the distance and relative speed of vehicles in front and adjust their own speed accordingly. These tasks can now be performed by an Adaptive Cruise Control (ACC) system, which is an extension of the conventional cruise control system.

Like a conventional cruise control system, ACC keeps the vehicle at a set constant speed. The significant difference, however, is that if a car with ACC is confronted with a slower moving vehicle ahead, it is automatically slowed down and then follows the slower vehicle at a set distance. Once the road ahead is clear again, the ACC accelerates the car back to the previous set cruising speed. In that way, ACC integrates a vehicle harmoniously into the traffic flow.


Speech Compression - a novel method

This paper illustrates a novel method of speech compression and transmission. The method saves a considerable amount of the transmission bandwidth required for the speech signal. The scheme exploits the low-pass nature of the speech signal. The method applies equally well to any signal that is low-pass in nature; speech, being the most widely used in real-time communication, is highlighted here.

As per this method, the low-pass signal (speech) at the transmitter is divided into a set of packets, each containing, say, N samples. Of the N samples per packet, only a smaller number of samples - a fixed fraction of N - are transmitted. Since this fraction is less than unity, compression is achieved. The N samples per packet are subjected to an N-point DFT. Since only low-pass signals are considered here, the number of significant values in the set of DFT samples is very limited, and transmitting these significant samples alone suffices for reliable transmission. The number of samples that are transmitted is determined by this fraction.

This fraction is almost independent of the source of the speech signal. In other methods of speech compression, specific characteristics of the source, such as pitch, are important for the algorithm to work.
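The packet-level scheme can be sketched end to end. The packet size N = 32 and the number of retained DFT bins are illustrative choices, not values from the paper; a plain O(N^2) DFT is used to keep the sketch self-contained.

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def compress_packet(x, keep):
    """Transmitter: keep only the low-frequency DFT bins (and their mirror images)."""
    X = dft(x)
    N = len(x)
    return [X[k] if (k < keep or k > N - keep) else 0j for k in range(N)]

# A band-limited packet survives the compression almost exactly:
N, keep = 32, 4
packet = [math.sin(2 * math.pi * 2 * n / N) for n in range(N)]  # low-pass content
received = idft(compress_packet(packet, keep))
error = max(abs(a - b) for a, b in zip(packet, received))
print(error < 1e-9)  # True
```

Only the `keep` complex values (plus their conjugate mirrors, which need not be sent) would actually go on the channel, which is where the bandwidth saving comes from.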

Download: Speech Compression - a novel method.doc

A Vectorizing Compiler for Multimedia Extension

This seminar report describes an implementation of a vectorizing compiler for Intel’s MMX (MultiMedia eXtensions). The compiler identifies data-parallel sections of the code using scalar expansion and array dependence analysis. To enhance the scope for application of the subword semantics, the compiler performs several code transformations, including strip mining, scalar expansion, grouping and reduction, and loop fission and distribution. Thereafter, inline assembly instructions corresponding to the data-parallel sections are generated. The compiler uses the Stanford University Intermediate Format (SUIF), a public-domain compiler tool, for implementation.

The performance of the code generated by this compiler is evaluated on multimedia benchmarks. Initial performance results reveal that the compiler-generated code produces a reasonable performance improvement (speedup of 2 to 6.5) over the code generated without the vectorizing transformations/inline assembly.

Download: A Vectorizing Compiler for Multimedia Extension.pdf

Wednesday, July 28, 2010

Wireless Application Protocol

The primary means of communicating information these days are voice and the Internet. Unlimited access to the Internet and the sheer number of people connected have made industry captains realize its potential. The industry now plans its marketing and communication strategies around the Internet. Today every sector, from banking to education, research to health care, is affected by it. E-mail is the way to communicate today; practically everyone who uses the Internet uses e-mail. The wireless technologies and the Internet were growing separately. The wireless industry initially struggled with a number of issues, like low bandwidth and low connection stability, in bringing the Internet to its users, and the two came together to form a common forum to tackle these issues. This forum is called WAP, the Wireless Application Protocol.

Download Here : Wireless Application Protocol.pdf


Honeypot is an exciting new technology with enormous potential for the security community. A honeypot is a resource which is intended to be attacked and compromised in order to gain more information about the attacker and his attack techniques. Honeypots are a highly flexible tool that comes in many shapes and sizes. This paper deals with understanding what a honeypot actually is and how it works.

There are different varieties of honeypots. Based on their category they have different applications. This paper gives an insight into the use of honeypots in productive as well as educative environments.

This paper also discusses the advantages and disadvantages of honeypots, and what the future holds in store for them.

Download Here :Honeypot.PDF


Biometrics is an automated method of capturing a person’s unique biological data that distinguishes him or her from another individual. Iris recognition has emerged as one of the most powerful and accurate identification techniques in the modern world. It has proven to be the most foolproof technique for the identification of individuals without the use of cards, PINs and passwords. It facilitates automatic identification whereby electronic transactions or access to places, information or accounts are made easier, quicker and more secure.

Download Here : IRIS Scan.pdf

Intelligent Network (IN)

An Intelligent Network (IN) is a service-independent telecommunications network. That is, intelligence is taken out of switches and placed in computer nodes that are distributed throughout the network. This provides the network operator with the means to develop and control services more efficiently. In an IN the services are provided independently of the bearer networks or equipment vendors. The IN is essentially an architecture which separates the service logic from the telephone exchanges, enabling the establishment of an open platform for uniform service creation, implementation and management. It enables advanced customer-orientated services to be rapidly and cost-effectively introduced.

Download Here : Intelligent Network (IN).pdf


New and increasingly advanced data services are driving up wireless traffic, which is being further boosted by growth in voice applications in advanced market segments as the migration from fixed to mobile voice continues. This is already putting pressure on some networks and may be leading to difficulties in maintaining acceptable levels of service to subscribers.
For the past few decades lower-bandwidth applications have been growing, but the growth of broadband data applications has been slow. Hence we require a technology which helps in the growth of broadband data applications. WiMAX is such a technology: it provides point-to-multipoint broadband wireless access without the need for direct line-of-sight connectivity with the base station.
This paper explains the WiMAX technology, its additional features in the physical layer and MAC layer, and the benefits of each feature.
This paper focuses on the major technical comparisons (like QoS and coverage) between WiMAX and other technologies. It also explains the ability of WiMAX to provide efficient service in a multipath environment.

Download : Wimax_Emerging_wireless_technology.pdf

64 BIT MicroProcessor

Internet commerce and large database applications are dealing with ever-increasing quantities of data, and the demands placed on both server and workstation resources are increasing correspondingly. One demand is for more memory than the 4 GB provided by today’s 32-bit computer architectures. Itanium’s ability to address a flat 64-bit memory address space in the millions of gigabytes has been the focus of attention. Beyond very large memory (VLM) support, however, other traits merit attention, including a new Explicitly Parallel Instruction Computing (EPIC) design philosophy that handles parallel processing differently than previous architectures, speculation, predication, large register files, a register stack and an advanced branch architecture. IA-64 also provides an enhanced system architecture supporting fast interrupt response and a flexible, large virtual address mode. The 64-bit addressing enabled by the Intel Itanium architecture will help overcome the scalability barriers and awkward, maintenance-intensive partitioning directory schemes of current directory services on 32-bit platforms. Intel has been assiduous in providing backward compatibility with 32-bit binaries (IA-32) from the x86 families. The Itanium is a complex, bleeding-edge, forward-looking processor family that holds promise for huge gains in processing power.

Download Here : 64 BIT MicroProcessor

SAN- Storage Area Network

The recent explosion in e-business activity and Internet commerce has provided organizations with unlimited opportunities for developing new information delivery channels. Data is today perceived to be the key asset for many organizations such as banks, stock exchanges, government records etc. This has generated an explosive demand for data storage, and this demand can be addressed by deploying a SAN. The ability to share a single large storage device across many servers or applications has made SAN an attractive option in today’s marketplace.
As organizations continue to broaden their reach to business partners and customers around the globe, they expose key IT systems to a wider range of potential security threats. Today data theft, fraud, hacker attempts, and human error increasingly threaten the security of information exchange within the enterprise and across public networks such as the Internet. In order to protect the key data stored, storage networking vendors are rapidly developing and deploying security frameworks that help ensure safe, reliable data processing throughout a storage area network (SAN).

Download Here : SAN- Storage Area Network.pdf

Digital Signature

A digital signature is the electronic equivalent of a handwritten signature, but there is more to it than pasting a graphic of a signature into a text document. Electronic signature software binds a signature, or other mark, to a specific document. Just as experts can detect a paper contract that was altered after it was signed, electronic signature software can detect the alteration of an electronically signed file at any time in the future. CIC (www.cic.com) and Silanis Technology (www.silanis.com) are pioneers of electronic signature technology, which has proven especially relevant in the financial, insurance and real estate industries. An electronic signature is often confused with a "digital signature," because it uses digital signature technology for detecting alteration. An electronic signature also requires user authentication such as a digital certificate, smart card or biometric method.
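The alteration-detection property can be illustrated with a short sketch. Real e-signature products bind signatures using public-key cryptography and certificates; the symmetric HMAC below is a deliberately simplified stand-in that only demonstrates how any change to the signed bytes is detected.

```python
import hashlib
import hmac

def sign(document: bytes, key: bytes) -> bytes:
    """Produce a tag bound to the exact document bytes."""
    return hmac.new(key, document, hashlib.sha256).digest()

def verify(document: bytes, key: bytes, signature: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(document, key), signature)

key = b"signer-secret"
doc = b"I agree to the terms."
sig = sign(doc, key)
print(verify(doc, key, sig))                 # True: document unchanged
print(verify(doc + b" (edited)", key, sig))  # False: alteration detected
```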

Download Here : Digital Signature.pdf Digital Signature.doc



Hyper-Threading Technology is a groundbreaking innovation from Intel® Corporation that enables multi-threaded software applications to execute threads in parallel. This level of threading technology has never been seen before in a general-purpose microprocessor. Internet, e-Business, and enterprise software applications continue to put higher demands on processors.

To improve performance in the past, threading was enabled in the software by splitting instructions into multiple streams so that multiple processors could act upon them. Today with Hyper-Threading Technology, processor-level threading can be utilized, which offers more efficient use of processor resources for greater parallelism and improved performance on today's multi-threaded software. Hyper-Threading Technology provides thread-level parallelism (TLP) on each processor, resulting in increased utilization of processor execution resources. As a result, resource utilization yields higher processing throughput. Hyper-Threading Technology is a form of simultaneous multi-threading technology (SMT) where multiple threads of software applications can be run simultaneously on one processor.

Download HERE : Hyper-Threading.pdf



In 1997, the IEEE ratified the 802.11 Wireless LAN standards, establishing a global standard for implementing and deploying Wireless LANs. The throughput for 802.11 is 2 Mbps, which was well below its IEEE 802.3 Ethernet counterpart. Late in 1999, the IEEE ratified the 802.11b standard extension, which raised the throughput to 11 Mbps, making this extension more comparable to the wired equivalent. 802.11b also supports the 2 Mbps data rate and operates on the 2.4 GHz radio frequency band for high-speed data communications.


Voice Over Internet Protocol-VOIP

Using an ordinary phone for most people is a common daily occurrence as is listening to your favorite CD containing the digitally recorded music. It is only a small extension to these technologies in having your voice transmitted in data packets.
The transmission of voice in the phone network was done originally using an analog signal, but this has been replaced in much of the world by digital networks. Although many of our phones are still analog, the network that carries the voice has become digital. In today's phone networks, the analog voice going into our analog phones is digitized as it enters the phone network. This digitization process, shown in Figure 1 below, records a sample of the loudness (voltage) of the signal at fixed intervals of time. These digital voice samples travel through the network one byte at a time.

Figure 1. Digital sampling of an analog voice signal

At the destination phone line, the byte is put into a device that takes the voltage number and produces that voltage for the destination phone. Since the output signal is the same as the input signal, we can understand what was originally spoken. The evolution of that technology is to take the numbers that represent the voltage and group them together in a data packet, similar to the way computers send and receive information over the Internet. Voice over IP is the technology of taking units of sampled speech data and carrying them in such packets.
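The sampling-and-quantization step of Figure 1 can be sketched directly. The 8 kHz rate is the standard telephony sampling rate; the uniform 8-bit quantizer below is a simplification (real phone networks use mu-law or A-law companding).

```python
import math

FS = 8000  # samples per second (telephony standard)

def sample_and_quantize(signal, seconds):
    """Sample an analog signal at fixed intervals and round each value to one byte."""
    samples = []
    for n in range(int(FS * seconds)):
        v = signal(n / FS)                                  # voltage in [-1, 1]
        samples.append(max(0, min(255, round((v + 1.0) * 127.5))))
    return samples

def reconstruct(byte):
    """Destination side: turn the byte back into a voltage."""
    return byte / 127.5 - 1.0

tone = lambda t: math.sin(2 * math.pi * 440 * t)  # a 440 Hz test "voice"
samples = sample_and_quantize(tone, 0.010)
print(len(samples))  # 80: ten milliseconds of speech fits in 80 bytes
```

Grouping runs of these bytes into packets, instead of carrying them one byte at a time, is precisely the step that turns digital telephony into Voice over IP.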

Download : Voice Over Internet Protocol-VOIP.pdf

Voice morphing

Voice morphing means the transition of one speech signal into another. Like image morphing, speech morphing aims to preserve the shared characteristics of the starting and final signals, while generating a smooth transition between them. Speech morphing is analogous to image morphing. In image morphing the in-between images all show one face smoothly changing its shape and texture until it turns into the target face. It is this feature that a speech morph should possess. One speech signal should smoothly change into another, keeping the shared characteristics of the starting and ending signals but smoothly changing the other properties. The major properties of concern as far as a speech signal is concerned are its pitch and envelope information. These two reside in a convolved form in a speech signal, hence some efficient method for extracting each of them is necessary. We have adopted an uncomplicated approach, namely cepstral analysis, to do the same. Pitch and formant information in each signal is extracted using the cepstral approach. Necessary processing to obtain the morphed speech signal includes methods like cross-fading of envelope information, Dynamic Time Warping to match the major signal features (pitch) and signal re-estimation to convert the morphed speech signal back into the acoustic waveform.
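The cepstral pitch extraction mentioned above can be sketched as follows: the cepstrum is the inverse DFT of the log magnitude spectrum, and a periodic signal produces a peak at the quefrency equal to its pitch period. The pulse-train test signal, frame length, and search range below are illustrative assumptions, not values from this project.

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def cepstrum(frame):
    """Real cepstrum: inverse DFT of the log magnitude spectrum."""
    logmag = [math.log(abs(v) + 1e-12) for v in dft(frame)]
    N = len(frame)
    return [abs(sum(logmag[k] * cmath.exp(2j * cmath.pi * k * n / N)
                    for k in range(N))) / N for n in range(N)]

# A glottal-pulse-like train with a 40-sample period (200 Hz at 8 kHz):
fs, period_true, N = 8000, 40, 240
frame = [1.0 if n % period_true == 0 else 0.0 for n in range(N)]
c = cepstrum(frame)
lo, hi = 25, 60                                   # search plausible pitch periods
period = lo + max(range(hi - lo), key=lambda i: c[lo + i])
print(fs // period)  # 200: pitch in Hz recovered from the cepstral peak
```

Separating the slowly varying (envelope/formant) part of the cepstrum from this pitch peak is what lets a morph crossfade envelope and pitch information independently.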

This report has been subdivided into seven chapters. The second chapter gives a concise idea of the various processes involved in this project. A thorough analysis of the procedure used to accomplish morphing, and the necessary theory involved, is presented in an uncomplicated manner in the third chapter. Processes like pre-processing, cepstral analysis, dynamic time warping and signal re-estimation are vividly described with the necessary diagrams. The fourth chapter gives a deep insight into the actual morphing process. The conversion of the morphed signal into an acoustic waveform is dealt with in detail in the fifth chapter. Chapter six summarizes the whole morphing process with the help of a block diagram. Chapter seven lists the conclusions that have been drawn from this project.

Download Full report : Voice morphing.doc


The Intel MMX™ technology comprises a set of extensions to the Intel architecture (IA) that are designed to greatly enhance the performance of advanced media and communications applications. These extensions (which include new registers, data types and instructions) are combined with the Single Instruction, Multiple Data (SIMD) execution model to accelerate the performance of applications such as motion video, combined graphics with video, image processing, audio synthesis, speech synthesis and compression, and 2D and 3D graphics, which typically use compute-intensive algorithms. All existing software that does not make use of this technology will also run on the processor without modification. Presented below is an elementary treatise on this technology from a programmer’s point of view.



As more and more information is relayed over and stored on the internet, it becomes increasingly important to scrutinize and determine the identity of those who access that information. In the modern world, the importance of information security has attained an all time high. Authentication refers to the act of verifying the identity of an entity or an object. In the world of Information security, it refers to a method of reliably identifying a person / entity as authorized to access certain information. The process of authentication often checks certain characteristics of the claimant or information that he/she possesses to confirm genuineness. On a computer system this process presents several difficulties, which in turn limit us to using three main ways to authenticate humans –

-  Biometric devices such as fingerprint analyzers or retinal scanners, which directly identify – who a user is
-  Smart cards and physical keys that can authenticate – what the user has
-  And passwords, which authenticate – what the user knows

Each of these techniques of authentication presents its own advantages and drawbacks, but because it is the cheapest and most convenient, password authentication has become the most popular. There is hardly a computer system that does not rely on passwords to authenticate its users.

The object of my seminar is to present a holistic view of password-based authentication systems and to analyze their problems. My special focus will be on dictionary attacks and methods to prevent dictionary attacks against password-based authentication systems.
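One standard defense worth sketching is salted, slow password hashing: a per-user random salt makes a precomputed dictionary of hashed words useless, since two users with the same password store different digests. The sketch below uses PBKDF2 from the Python standard library; the iteration count and salt size are illustrative choices, not recommendations from the seminar.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Store (salt, digest); the random per-user salt defeats precomputed tables."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, stored = hash_password("correct horse")
print(check_password("correct horse", salt, stored))   # True
print(check_password("password123", salt, stored))     # False
# Same password, different user: a different salt yields a different digest.
salt2, stored2 = hash_password("correct horse")
print(stored2 != stored)                               # True
```

The 100,000 PBKDF2 iterations also slow each dictionary guess by the same factor, which is the other half of the defense against dictionary attacks.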

Download Full Seminar Report Here :  INTERNET SECURITY .doc

Real-Time Systems

Real-time systems are becoming pervasive. Typical examples of real-time systems include air traffic control systems, networked multimedia systems, and command and control systems. In a real-time system the correctness of the system behavior depends not only on the logical results of the computations, but also on the physical instant at which these results are produced. Real-time systems are classified from a number of viewpoints, i.e. on factors outside the computer system and factors inside the computer system. Special emphasis is placed on hard and soft real-time systems. A missed deadline in hard real-time systems is catastrophic, while in soft real-time systems it can lead to a significant loss. Hence predictability of the system behavior is the most important concern in these systems. Predictability is often achieved by either static or dynamic scheduling of real-time tasks to meet their deadlines. Static scheduling makes scheduling decisions at compile time and is off-line. Dynamic scheduling is online and uses a schedulability test to determine whether a set of tasks can meet their deadlines. The present paper talks about static and dynamic scheduling algorithms and operating systems support for these mechanisms.
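A classic example of such a schedulability test, used by the dynamic earliest-deadline-first (EDF) scheduler (the paper's exact tests may differ), is the utilization bound for independent periodic tasks on one processor:

```python
def edf_schedulable(tasks):
    """tasks: list of (C, T) pairs - worst-case execution time and period.

    Independent periodic tasks with deadlines equal to their periods are
    EDF-schedulable on a single processor iff sum(C_i / T_i) <= 1.
    """
    return sum(c / t for c, t in tasks) <= 1.0

print(edf_schedulable([(1, 4), (2, 6), (1, 8)]))  # True: utilization ~ 0.71
print(edf_schedulable([(3, 4), (2, 6)]))          # False: utilization > 1
```

A dynamic scheduler runs such a test online at task admission time, whereas a static scheduler would have resolved the whole schedule at compile time.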

Download Full Seminar Report Here : Real-Time Systems.doc

NANO Technology

The advent of nanotechnology in cancer research couldn’t have come at a more opportune time. The vast knowledge of cancer genomics and proteomics emerging as a result of the Human Genome Project is providing critically important details of how cancer develops, which, in turn, creates new opportunities to attack the molecular underpinnings of cancer. However, scientists lack the technological innovations to turn promising molecular discoveries into benefits for cancer patients. It is here that nanotechnology can play a pivotal role, providing the technological power and tools that will enable those developing new diagnostics, therapeutics, and preventives to keep pace with today’s explosion in knowledge.
                          Nanotechnology provides nanoscale materials that can be synthesized and that function in the same general size range as biologic structures. Attempts are being made to develop forms of anticancer therapeutics based on nanomaterials. Dendritic polymer nanodevices serve as a means for the detection of cancer cells, the identification of cancer signatures, and the targeted delivery of anticancer therapeutics (cisplatin, methotrexate, and taxol) and contrast agents to tumor cells. Initial studies documented the synthesis and function of a targeting module, several drug delivery components, and two imaging/contrast agents. Analytical techniques have been developed and used to confirm the structure of the device. Progress has been made on the specifically triggered release of the therapeutic agent within a tumor using high-energy lasers. The work to date has demonstrated the feasibility of the nanodevice concept in actual cancer cells in vitro.

Download full seminar report Here : Nano Technology.doc


Billions of visible LEDs are produced each year, and the emergence of high-brightness AlGaAs and AlInGaP devices has given rise to many new markets. The surprising growth of activity in the relatively old LED technology has been spurred by the introduction of AlInGaP devices. Recently developed AlGaInN materials have led to improvements in the performance of bluish-green LEDs, which have luminous efficacy peaks much higher than those of incandescent lamps. This advancement has led to the production of large-area full-color outdoor LED displays with diverse industrial applications.

The novel idea of this article is to modulate light waves from visible LEDs for communication purposes. This concurrent use of visible LEDs for simultaneous signaling and communication, called iLight, leads to many new and interesting applications and is based on the idea of fast switching of LEDs and the modulation of visible-light waves for free-space communications. The feasibility of such an approach has been examined, and hardware has been implemented with experimental results. The implementation of an optical link has been carried out using an LED traffic-signal head as a transmitter. The LED traffic light (fig 1 below) can be used for either audio or data transmission.
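A hypothetical sketch of the fast-switching idea (the article's actual iLight modulation scheme is not reproduced here): simple on-off keying, where the LED is driven high for a 1 bit and low for a 0 bit, switching far too fast for the eye to notice, while a photodiode receiver averages samples to recover each bit.

```python
def ook_modulate(data, samples_per_bit=4):
    """On-off keying: LED on (1) for a 1 bit, off (0) for a 0 bit.
    Returns the sample stream that would drive the LED."""
    signal = []
    for byte in data:
        for i in range(7, -1, -1):          # most significant bit first
            bit = (byte >> i) & 1
            signal.extend([bit] * samples_per_bit)
    return signal

def ook_demodulate(signal, samples_per_bit=4):
    """Average each bit period at the photodiode and threshold at 1/2."""
    bits = [1 if 2 * sum(signal[i:i + samples_per_bit]) >= samples_per_bit else 0
            for i in range(0, len(signal), samples_per_bit)]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

msg = b"iLight"
assert ook_demodulate(ook_modulate(msg)) == msg
```

Real free-space links add framing, error correction, and tolerance to ambient light, which this toy omits.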

Download full Seminar report Here : LED WIRELESS.pdf


Smart cards are plastic credit-card-sized devices with an integrated circuit chip containing a microprocessor. These smart cards have mechanisms for storing and/or processing information. Intelligent tokens (iButtons) are high-capacity general-purpose electronic data carriers, each with a unique registration number. They have the same components as smart cards but are shielded by a steel case. This paper will show the differences between contact smart cards, contactless smart cards, and intelligent tokens (iButtons) with respect to RAM, ROM, EEPROM, and other related aspects.

Download Full Seminar Report Here : LATEST SMART CARD FEATURES

LASER Communication


        Lasers have been considered for space communications since their realization in 1960. Specific advancements were needed in component performance and system engineering, particularly for space-qualified hardware. Advances in system architecture, data formatting, and component technology over the past three decades have made laser communications in space not only viable but also an attractive approach for intersatellite-link applications.

     Information transfer demands are driving the requirements to higher data rates, matched by rapid growth in laser cross-link technology, global development activity, increased hardware availability, and design maturity. Most important in space laser communications has been the development of a reliable, high-power, single-mode laser diode as a directly modulable laser source. This technology advance offers the space laser communication system designer the flexibility to design very lightweight, high-bandwidth, low-cost communication payloads for satellites whose launch costs are a very strong function of launch weight. This feature substantially reduces blockage of the fields of view of the most desirable areas on satellites. The smaller antennas, with diameters typically less than 30 centimeters, create less momentum disturbance to any sensitive satellite sensors. Fewer on-board consumables are required over the long lifetime because there are fewer disturbances to the satellite compared with heavier and larger RF systems. The narrow beam divergence affords interference-free and secure operation.

Download full seminar report :  LASER Communication


In today’s information age it is not difficult to collect data about an individual and use that information to exercise control over the individual. Individuals generally do not want others to have personal information about them unless they decide to reveal it. With the rapid development of technology, it is more difficult to maintain the levels of privacy citizens knew in the past. In this context, data security has become an inevitable feature. Conventional methods of identification based on possession of ID cards or exclusive knowledge like a social security number or a password are not altogether reliable. ID cards can be lost, forged, or misplaced; passwords can be forgotten. Biometric technology has now become a viable alternative to traditional identification systems because of its tremendous accuracy and speed. This paper explores the concept of iris recognition, which is one of the most popular biometric techniques. This technology finds applications in diverse fields.
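One widely used iris-matching technique (an illustrative sketch, not the paper's own method) compares two binary iris codes by their fractional Hamming distance; the 0.32 decision threshold used below is a commonly cited rule of thumb, and the codes here are randomly generated stand-ins for real templates.

```python
import random

def hamming_distance(code_a, code_b, bits=2048):
    """Fraction of disagreeing bits between two iris codes.
    Distances well below ~0.32 typically indicate the same iris."""
    return bin((code_a ^ code_b) & ((1 << bits) - 1)).count("1") / bits

random.seed(0)
enrolled = random.getrandbits(2048)

# Same iris re-imaged: simulate sensor noise by flipping a few bits
noisy = enrolled
for _ in range(100):
    noisy ^= 1 << random.randrange(2048)

impostor = random.getrandbits(2048)  # an unrelated iris

print(hamming_distance(enrolled, noisy) < 0.32)     # True  -> accept
print(hamming_distance(enrolled, impostor) > 0.32)  # True  -> reject
```

Two codes from different irises disagree on roughly half their bits, so genuine and impostor distances separate cleanly around the threshold.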

Download full seminar report : IRIS SCAN.pdf

IPv4-IPv6 Transition Technology

    In the coming decade, we are said to confront the final frontiers of the mobile revolution. 3G, or third-generation mobile technology, is expected to fulfil the idea of location- and data-independent communication. It is now almost clear that packet-switched (PS) technology will dominate technological developments in the mobile sector. Packet-switched technology would not only integrate the mobile infrastructure with the already existing Internet backbone, but also provide the facility of an “always on” connection. These two features are seen as two great leaps forward in the direction of 3G. Packet-switched technology would not only enhance the existing data messaging services, but also provide alternative voice services through VoIP. The mobile vendors have planned long-term strategies for 3G evolution. The most popular path is through GSM-GPRS-EDGE. This has allowed a gradual transition, giving enough time for development, deployment, and testing of better technologies.


          The harnessing of electronics to measure odor is greatly to be desired. Human panels backed up by gas chromatography and mass spectrometry are helpful in quantifying smells, but they are time-consuming, expensive, and seldom performed in real time in the field. So it is important that these traditional methods give way to a speedier procedure using an electronic nose composed of gas sensors. Electronic noses, or e-noses, are systems that detect and identify odours and vapours, typically linking chemical sensing devices with signal processing, pattern recognition, and artificial intelligence techniques that enable users to readily extract relevant and reliable information.
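The pattern-recognition step can be sketched as a nearest-neighbor match of a sensor-response vector against stored odor fingerprints (the sensor values and odor classes below are invented purely for illustration):

```python
import math

def classify_odor(sample, references):
    """Nearest-neighbor pattern matching: each odor is a vector of
    gas-sensor responses; the closest stored fingerprint wins."""
    return min(references, key=lambda name: math.dist(sample, references[name]))

# Hypothetical fingerprints: one response per gas sensor in the array
references = {
    "coffee":  [0.9, 0.2, 0.4, 0.1],
    "ethanol": [0.3, 0.8, 0.1, 0.6],
    "ammonia": [0.1, 0.3, 0.9, 0.7],
}

print(classify_odor([0.85, 0.25, 0.35, 0.15], references))  # coffee
```

Production e-noses typically add baseline correction and use richer classifiers (PCA, neural networks) over the same idea of matching response patterns.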

Download full Seminar report Here:


The term electronic commerce refers to any financial transaction involving the electronic transmission of information. The packets of information being transmitted are commonly called electronic tokens. One should not confuse the token, which is a sequence of bits, with the physical media used to store and transmit the information.
        We will refer to the storage medium as a card since it commonly takes the form of a wallet-sized card made of plastic or cardboard. (Two obvious examples are credit cards and ATM cards.) However, the "card" could also be, e.g., a computer memory.
          A particular kind of electronic commerce is that of electronic payment. An electronic payment protocol is a series of transactions, at the end of which a payment has been made using a token issued by a third party. The most common example is that of credit cards when an electronic approval process is used. Note that our definition implies that neither payer nor payee issues the token.
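A minimal sketch of a third-party-issued token (hypothetical; real payment protocols are far more elaborate): the issuer binds the payment details to the token's bit sequence with a MAC under a key only it holds, so neither payer nor payee can forge or alter it.

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-secret"  # held only by the issuing third party

def issue_token(payer, payee, amount):
    """The token is just a sequence of bits; the issuer's HMAC ties it
    to the payment details so tampering is detectable."""
    body = {"payer": payer, "payee": payee, "amount": amount}
    msg = json.dumps(body, sort_keys=True).encode()
    body["mac"] = hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()
    return body

def verify_token(token):
    body = {k: v for k, v in token.items() if k != "mac"}
    msg = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["mac"])

token = issue_token("alice", "bob", 100)
print(verify_token(token))   # True
token["amount"] = 1000       # payee inflates the amount...
print(verify_token(token))   # False - the token no longer verifies
```

Real schemes add a unique serial number and issuer-side bookkeeping to prevent a valid token from being spent twice.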

Download full seminar Report

DCOM Technical Overview


Microsoft® Distributed COM (DCOM) extends the Component Object Model (COM) to support communication among objects on different computers—on a LAN, a WAN, or even the Internet. With DCOM, your application can be distributed at locations that make the most sense to your customer and to the application.

              Because DCOM is a seamless evolution of COM, the world's leading component technology, you can take advantage of your existing investment in COM-based applications, components, tools, and knowledge to move into the world of standards-based distributed computing. As you do so, DCOM handles low-level details of network protocols so you can focus on your real business: providing great solutions to your customers.

Download Full Seminar Report:

DCOM Technical Overview.doc

DCOM Technical Overview.rar


A biochip is a collection of miniaturized test sites (microarrays) arranged on a solid substrate that permits many tests to be performed at the same time in order to achieve higher throughput and speed. Typically, a biochip’s surface area is no larger than a fingernail. Like a computer chip that can perform millions of mathematical operations in one second, a biochip can perform thousands of biological operations, such as decoding genes, in a few seconds.
 A genetic biochip is designed to “freeze” into place the structures of many short strands of DNA (deoxyribonucleic acid), the basic chemical instruction that determines the characteristics of an organism. Effectively, it is used as a kind of “test tube” for real chemical samples.
 A specifically designed microscope can determine where the sample hybridized with DNA strands in the biochip. Biochips helped to dramatically increase the speed of identification of the estimated 80,000 genes in human DNA in the worldwide research collaboration known as the Human Genome Project. The microchip is described as a sort of “word search” function that can quickly sequence DNA.
 In addition to genetic applications, the biochip is being used in toxicological, protein, and biochemical research. Biochips can also be used to rapidly detect chemical agents used in biological warfare so that defensive measures can be taken. Motorola, Hitachi, IBM, and Texas Instruments have entered the biochip business.
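The “word search” analogy can be sketched as scanning a DNA strand for every site where a probe would hybridize, i.e. every exact match of the probe's reverse complement (a toy illustration, not real biochip software):

```python
def find_probe(sequence, probe):
    """'Word search' over a DNA strand: report every position where the
    probe would hybridize, i.e. where the strand matches the probe's
    reverse complement exactly."""
    complement = str.maketrans("ACGT", "TGCA")
    target = probe.translate(complement)[::-1]  # reverse complement
    return [i for i in range(len(sequence) - len(target) + 1)
            if sequence[i:i + len(target)] == target]

# Probe GCC hybridizes where the strand reads GGC
print(find_probe("AAGGCTTAA", "GCC"))  # [2]
```

A real microarray runs thousands of such probe "searches" in parallel, one per test site, with the microscope reading out which sites lit up.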
Download Here:

Automatic Vehicle Locator (AVL)

Is your car or other vehicle stolen, or is it not visible in the thickest snow, or is it one among several cars present? Do you want to know the arrival time of the bus for which you are waiting? Are your children traveling alone in a vehicle and you want to track their movements? Does your cargo consist of a costly load that you want to protect? Do you want to keep track of where your little kids are playing?

ANS: Automatic Vehicle Locator.

This paper gives us a novel approach of using GPS technology to track not only vehicles, but even children, and to protect precious goods, so this technology has gained a lot of importance in recent years. The paper describes how the technology works and its applications. It is still in the research and development stage.
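One building block of any vehicle tracker is turning successive GPS fixes into distance traveled. A standard haversine great-circle computation (the coordinates below are hypothetical) looks like this:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, used to estimate how
    far a tracked vehicle has moved between position reports."""
    R = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Two successive fixes from a moving vehicle (hypothetical coordinates)
d = haversine_km(12.9716, 77.5946, 12.9352, 77.6245)
print(round(d, 1), "km between fixes")
```

Dividing such distances by the time between reports also gives the tracker a speed estimate for alerts.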

Download here:
Automatic Vehicle Locator.pdf
Automatic Vehicle Locator.rar



   ATM does not stand for automatic teller machine. In telecommunication it stands for Asynchronous Transfer Mode, in which data is sent asynchronously. This mode is another fast packet-switching mode.
       ATM is regarded as the technology of the 21st century, and its impact is expected to be similar to that of PCM (pulse code modulation), which is used widely around the world in telecommunication.

                 Asynchronous Transfer Mode (ATM) is a technology that has its history in the development of broadband ISDN in the 1970s and 1980s. Technically, it can be viewed as an evolution of packet switching. Like packet switching for data, ATM integrates the multiplexing and switching functions, is well suited for bursty traffic, and allows communication between devices that operate at different speeds. Unlike packet switching, ATM is designed for high-performance multimedia networking.

                   ATM is also a set of international interface and signaling standards defined by the International Telecommunication Union Telecommunication Standardization Sector (ITU-T), formerly the CCITT. The ATM Forum has played a pivotal role in the ATM market since its formation in 1991.
                   The ATM Forum is an international voluntary organization composed of vendors, service providers, research organizations, and users. Its purpose is to accelerate the use of ATM products and services through the rapid convergence of interoperability specifications, promotion of industry cooperation, and other activities. Developing multivendor implementation agreements also furthers this goal.
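The fixed 53-byte cell (a 5-byte header plus a 48-byte payload) that distinguishes ATM from variable-length packet switching can be sketched as follows; the header bit layout is simplified and the HEC checksum is omitted, so this is an illustration of the framing idea rather than a conformant implementation.

```python
def make_atm_cells(payload, vpi, vci):
    """Split a message into fixed 53-byte ATM cells: a 5-byte header
    carrying the VPI/VCI routing fields (simplified packing) followed by
    a 48-byte payload, zero-padded on the last cell."""
    cells = []
    for i in range(0, max(len(payload), 1), 48):
        chunk = payload[i:i + 48].ljust(48, b"\x00")
        header = bytes([
            (vpi >> 4) & 0xFF,                        # GFC/VPI bits (simplified)
            ((vpi & 0xF) << 4) | ((vci >> 12) & 0xF), # VPI low bits | VCI high bits
            (vci >> 4) & 0xFF,                        # VCI middle bits
            (vci & 0xF) << 4,                         # VCI low bits; PT/CLP left 0
            0x00,                                     # HEC placeholder (not computed)
        ])
        cells.append(header + chunk)
    return cells

cells = make_atm_cells(b"A" * 100, vpi=5, vci=32)
print(len(cells), all(len(c) == 53 for c in cells))  # 3 True
```

The constant cell size is what lets ATM switches make fast, hardware-friendly forwarding decisions and bound queueing delay for multimedia traffic.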


Artificial Neural Networks (ANNs)

Artificial neural networks (ANNs) are programs designed to solve problems by trying to mimic the structure and function of our nervous system. Neural networks are based on simulated neurons, which are joined together in a variety of ways to form networks.

Many tasks which seem simple for us, such as reading a handwritten note or recognizing a face, are difficult even for the most advanced computer. In an effort to increase the computer's ability to perform such tasks, programmers began designing software to act more like the human brain, with its neurons and synaptic connections. Thus the field of artificial neural networks was born. Rather than employing the traditional method of one central processor (such as a Pentium) to carry out many instructions one at a time, artificial neural network software analyzes data by passing it through several simulated processors which are interconnected with synapse-like “weights”.

                   Once we have collected several records of the data we wish to analyze, the network runs through them and “learns” how the inputs of each record may be related to the result. After training on a few dozen cases, the network begins to organize and refine its own architecture to fit the data, much like the human brain: it learns from example.

          This reverse-engineering technology was once regarded as the best-kept secret of large corporate, government, and academic researchers.

          The field of neural networks was pioneered by Bernard Widrow of Stanford University in the 1950s.
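The "learning from example" loop described above can be sketched with a single simulated neuron, the classic perceptron, trained here on the logical AND function (a minimal illustration, not code from the original text):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """A single simulated neuron: weighted inputs plus a bias, trained by
    nudging the 'synaptic' weights whenever the output disagrees with the
    example - learning from example, as described above."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out                 # 0 when the neuron is right
            w[0] += lr * err * x1              # adjust each weight
            w[1] += lr * err * x2
            b += lr * err                      # adjust the bias
    return w, b

# Learn the logical AND function from four examples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
       for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Modern networks stack many such neurons in layers and train them with backpropagation, but the adjust-on-error principle is the same.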