
A.D.Patel Institute of Technology

Project Report on

AUDIO VIDEO CONFERENCE SYSTEM

A REPORT SUBMITTED IN PARTIAL FULFILLMENT FOR

THE DEGREE OF BACHELOR OF ENGINEERING IN

COMPUTER ENGINEERING

BY

PATEL VISHWA G. (06CP075)

SOLANKI PRAKASH R. (06CP015)

Guided By

Mr. TEJAS VASAVADA

B.E. SEMESTER VIII (COMPUTER ENGINEERING)

YEAR: 2010

A.D.PATEL INSTITUTE OF TECHNOLOGY

NEW V.V.NAGAR (GUJARAT – INDIA).

SARDAR PATEL UNIVERSITY

VALLABH VIDYANAGAR


A. D. Patel Institute of Technology
Department of Computer Engineering

New V. V. Nagar

CERTIFICATE

This is to certify that PATEL VISHWA (06CP075) and SOLANKI PRAKASH (06CP015) of final year Computer Engineering have satisfactorily completed their project work entitled “AUDIO VIDEO CONFERENCE SYSTEM” in the academic year 2009-10, in partial fulfillment of the requirements for the award of the degree of Bachelor of Engineering in Computer Engineering at Sardar Patel University, Vallabh Vidyanagar.

Date:

Project Guide: Mr. Tejas Vasavada
Head of Department: Mr. Ramji Makwana

Principal: Dr. R. K. Jain


ACKNOWLEDGEMENT

We take this opportunity to express our gratitude to the few individuals who guided us in every aspect of our project. Without their support and guidance, this project work might not have been completed.

First and foremost, we would like to thank our Principal, Dr. R. K. Jain, for his guidance, for sharing his expertise, and for believing in the concept and viability of this project.

We also express our sincere gratitude to our Head of Department (Computer Engineering), Mr. Ramji Makwana, for providing us the laboratory facilities needed to complete our project.

We express our heartfelt gratitude to our Project Guide, Mr. Tejas Vasavada, for supporting the project and its fundamentals with his timely and direct help. We learned a great deal from him.

Finally, we would like to thank all the laboratory staff, with special thanks to Mr. Shirish Patel for providing us the software and administrative support for our project.


Contents

1. Project Abstract 6
   1.1 Project Definition 6
   1.2 Project Introduction 6
2. Requirement Analysis 8
   2.1 Requirement Elicitation 8
   2.2 Requirement Determination 9
       2.2.1 Requirement Specification 9
       2.2.2 Requirement Clarification 9
       2.2.3 Request Approval 10
   2.3 Feasibility Analysis 11
       2.3.1 Operational Feasibility 11
       2.3.2 Technical Feasibility 12
       2.3.3 Economical Feasibility 13
       2.3.4 Basic Software and Hardware Required 14
3. Technology and Literature Review 15
   3.1 Java Sockets 15
   3.2 Technical Background 16
   3.3 Java Media Framework 16
       3.3.1 JMF Architecture 19
       3.3.2 Streaming Media 19
       3.3.3 RTP Services 20
       3.3.4 Principal Elements 20
4. System Design 24
   4.1 Context Diagram 25
   4.2 Client Connection to Server 26
5. System Implementation 29
   5.1 Application Architecture and Overview 30
   5.2 Server Implementation 32
6. Implementation Details 34
7. Implementation Results 35
8. Future Enhancements and Conclusion 38
   8.1 Limitations 38
   8.2 Future Enhancements 38
   8.3 Conclusion 39
9. References 40
   9.1 Books 40
   9.2 Website Links 40

Chapter 1: Abstract

1.1 Project Definition

To design and develop an audio-video conferencing system that uses the RTP protocol for real-time data transmission across a LAN using the JMF API.

In this project we went through several stages. In the requirement analysis stage we determined what the project must do; since our project transmits real-time data, we used the Java Media Framework (JMF) API. We used NetBeans 6.5 for developing the system in the Java language.

Our system is a client-server application: a client sends a connection request to the server and, after the server has authenticated it, the different clients can communicate with each other. The system provides message chatting, file sharing, and audio-video conferencing. This report includes detailed information about all aspects of the project.

1.2 Project Introduction

To design and develop a video conferencing program that uses the RTP protocol for real-time data transmission across a LAN using the JMF API. A video conferencing system is a system that handles real-time data transmission across the network. It uses the Real-time Transport Protocol (RTP) to transfer data over the network, and it also provides features such as media buffering at the receiver and at the transmitter.

Video comes in many formats depending on the connected devices, but this program automatically identifies the attached video device and directly uses the device's default resolution. Similarly, it also finds the audio capture devices and uses the default sound card for audio output.

Any video conferencing terminal must have a few basic components to "get the job done": a camera (to capture local video), a video display (to display remote video), a microphone (to capture local audio), and speakers (to play remote audio.) In addition to these more obvious components, a videoconferencing terminal also includes a codec ("COmpressor/DECompressor"), a user interface, a computer system to run on, and a network connection. Each of these components plays a key role in determining the quality, reliability, and user-friendliness of the videoconferencing experience as well as any given videoconferencing terminal's suitability to particular purposes.

Within a videoconference, audio is as important as, and often considered more important than, video. If we lose video or experience poor video quality in a conference but audio remains intact, we can still accomplish many of our communication objectives.


Chapter 2: Requirement Analysis

The requirement analysis stage decides what the project must do and what is expected of it. To develop a project, one must have a reason for the project to be made. This stage is aimed at determining the reasons for project development and collecting enough information for system analysis and design.

2.1 Requirement Elicitation

The information about the requirements of the proposed system, and the requirements to be fulfilled by the system, had to be discovered. For this, the following was done.

Identifying users:

o This application is useful to any Java developer, but it is used by administrators for monitoring purposes, and by engineers and supervisors to analyze different areas of a classroom, a lab, or any similar environment.



Many protocols handle the transfer of such data, and hence many similar systems are available in the market. For example, Windows NetMeeting is one such software package that provides all the basic functionality. Yahoo Messenger beta also provides video conferencing, and Windows Messenger offers similar support.

There are also examples of software-based phone systems. One example is Skype, an Internet phone system. It allows users to have voice conversations, free of charge, over the Internet, provided that the party they are calling is also using the Skype service. The disadvantage is that a company employing this system would have no control over its users. Another example is Vonage, which offers the same sort of service as Skype and hence has the same disadvantages.

2.2 Requirement Determination

2.2.1 Requirement Specification

Requirement specification is the activity of translating the information gathered during the analysis activity into a document that defines a set of requirements. Two types of requirements may be included in this document. User requirements are abstract statements of the system requirements for the customer and end user of the system. System requirements are a more detailed description of the functionality to be provided.

With reference to the above discussion and the problem definition, the system has to provide the following functionality:

Design software that provides communication between two remote users connected via some kind of interconnection network, where the data transferred is video, audio, or any combination of the two.

2.2.2 Requirement Clarification

This stage clarifies what we understood and what the expectations were. Requirement clarification was carried out with our project guide to get an exact view of the system to be developed and to obtain a revised, prioritized requirement specification.



Project Guide

The project guide explained the basic strategy that we should follow to develop the system. He advised us to divide the project into modules and to complete each module within a specific time interval.

Outcome

The key points that came up as important for the system design were the following:

Powerful: Reduce function-call overhead by maximum use of run-time polymorphism. Create a thread for each new request coming from a client; all processors start their work in threads only, achieving the maximum degree of process-level parallelism.

Flexible: The application must run on any Java-compatible machine while minimizing memory usage. Similarly, data transfer through the network should be minimal, so that there is a low burden on the network.

Advanced: Create a media player that is capable of playing the common media file formats, and record video with audio without any delay between them.

View: Provide a Windows XP look for each user input form.


2.2.3 Request Approval

The project guide approved the project requirement specifications identified by us and permitted us to go ahead with the next phases of system analysis and development. He gave a few technical suggestions for development and guidelines for further analysis. He told us to make an elaborate project schedule for tracking the work done. A few important points that he dictated can be summarized as follows:

After every analysis we must approach our project guide for difficulties and final approval of our work before going any further.

Any help needed will be promptly available as and when required.

2.3 Feasibility Analysis

We reviewed the following three kinds of feasibility:

Operational Feasibility

Technical Feasibility

Economical Feasibility

2.3.1 Operational Feasibility

The proposed project, the ‘Video Conferencing System’, is best suited for company managers, students, teachers, LAN administrators, programmers, or anyone who is interested in real-time data transmission over the LAN.



Programmer

When a user develops a network program, the user may need to know exactly what it has sent and received. Our system provides a way to know everything about the data transmitted through the network. The system converts all data to a particular object and sends it to the server, which has the authority to decode and access it. A user can send a login request in the form of a User object by writing the appropriate message string into a field called data; the server decodes it and performs the appropriate function. Because our system uses serialization, all such objects can migrate from one JVM to the next.
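
A minimal sketch of this idea is shown below. The report does not list the actual Master and User classes, so the field names (id, username, data), the port, and the LoginSender helper are illustrative assumptions, not the project's real code.

    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.net.Socket;

    // Base class for all custom objects exchanged with the server (assumed structure).
    class Master implements Serializable {
        int id;                                   // lets the server identify the object type
    }

    // A login request: the message string is carried in the field called "data".
    class User extends Master {
        String username;
        String data;

        User(String username, String data) {
            this.id = 1;                          // e.g. 1 = "user/login" object
            this.username = username;
            this.data = data;
        }
    }

    class LoginSender {
        // Serializes a User object and sends it to the server over a TCP socket.
        static void sendLogin(String serverHost, int port, String username) throws IOException {
            Socket socket = new Socket(serverHost, port);
            ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
            out.writeObject(new User(username, "LOGIN"));
            out.flush();
            socket.close();
        }
    }

On the server side, a matching ObjectInputStream.readObject() call recovers the same object graph in the server's JVM, which is what the serialization-based migration described above relies on.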

Network Administrator

A network administrator can use this system to send messages to all terminals connected to the LAN.

To monitor lab activities during the day while sitting at his own terminal.

To chat with any other lab on campus or to make a video call.

2.3.2 Technical Feasibility

The system is developed using an application development tool, the NetBeans IDE, together with the JMF 2.1.1e API. This technology is well suited to the system: it provides good facilities for managing the code and helps the programmer technically. NetBeans is mainly used to reduce development time and to manage user interaction more efficiently and comfortably.

This is the program that was used to write and compile all of the Java code. It was chosen because it was available for free and was very straightforward to use; it was simple, but it did the job. One feature that was very helpful was that it highlighted common coding errors, which saved a lot of time; otherwise, the developer might not have been informed of these errors until after compilation.

The system requires no extra hardware for message chatting, and it never requires all of the components to be present in order to work. Module-wise implementation gives it the benefit of limiting some features while the related hardware is unavailable, so it runs at minimum operating cost.




2.3.3 Economical Feasibility

Our system uses the latest technology for development. NetBeans 5.5.1 is much more than a compiler; it is a complete application development environment that, when used as intended, lets you fully exploit the object-oriented nature of Java to create professional Windows applications. JMF is a free, public API for direct access to real-time data under Windows using the JDK. This means the system is economical not only for technological reasons but also for operational reasons.

2.3.4 Basic Software and Hardware Required

Software                              Intent
JDK 1.6.0                             For programming
NetBeans 5.1 & Notepad                As editing tools
Java Media Framework API              For accessing real-time data

Hardware                              Intent
PC with any OS                        To provide the client-server interface through computers connected to the LAN
Java Development Toolkit 1.6          As the Java platform (JVM)
128 MB RAM or higher                  For fast processing
10 GB hard disk                       For storage


Chapter 3: Technology and literature review

3.1 Java Sockets

3.2 Technical Background

3.3 Java Media Framework


3.1 Java Sockets

One of the Java features that allows two computers to communicate, and that lets us build a client-server application, is Java sockets. Sockets are an innovation that allows the programmer to treat a network connection as just another stream onto which bytes can be written and from which bytes can be read. Sockets shield the programmer from low-level details of the network, such as media types, packet sizes, packet retransmission, network addresses and more. Java's Socket class, which is used by both clients and servers, has methods that correspond to features such as connecting to a remote machine, sending and receiving data, closing a connection, listening for incoming data on a port, and accepting connections from remote machines on the bound port.
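
A minimal, self-contained sketch of this client/server pattern is shown below; the port number (5000) and the one-line echo exchange are arbitrary choices for illustration, not part of the project's actual protocol.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class SocketSketch {

        // Server side: listen on a port, accept one connection, echo one line back.
        public static void runServer() throws IOException {
            ServerSocket server = new ServerSocket(5000);
            Socket client = server.accept();                  // blocks until a client connects
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            PrintWriter out = new PrintWriter(client.getOutputStream(), true);
            out.println("echo: " + in.readLine());            // receive a line and send it back
            client.close();
            server.close();
        }

        // Client side: connect to the server, send a line, print the reply.
        public static void runClient(String host) throws IOException {
            Socket socket = new Socket(host, 5000);           // connect to the remote machine
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            out.println("hello");                             // send data
            System.out.println(in.readLine());                // receive data
            socket.close();
        }
    }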

3.2 Technical Background

In this chapter, the various standards used in the design of this system will be discussed. The standards chosen were based on what was supported by the Java Media Framework. There were possibly some more suitable options out there, for example with the encoding schemes, but the choice was limited by what was supported by the Java Media Framework and the Real-Time Transport Protocol. The standards discussed within this chapter were the basic building blocks that this project was built on.

3.3 Java Media Framework

It is often the case that a Java developer will want to include some real-time media within their Java application or applet. Prime examples of such real-time media would be audio and video. The Java Media Framework (JMF) was developed to enable this to happen. JMF allows the capture, playback, streaming and transcoding of multiple media formats.

JMF is an extension of the Java platform that provides a powerful toolkit from which scalable, cross platform applications can be developed. Any data that changes with respect to time can be characterized as real-time media.



With real-time media, the idea is that you will see it as it happens. So for example, if you are taking part in a video conference, you expect that there should not be a significant delay between when the other person says something to you and when you hear and see them saying it. Audio clips, MIDI sequences, movie clips, and animations are common forms of time-based media. Such media data can be obtained from a variety of sources, such as local or network files, cameras, microphones, and live broadcasts. Figure 3.3.1, below, shows a media processing model.

There are three main elements within the system: the input, the output and the processor. Think of the input as where the data comes from; this could be a capture device such as a video camera, a file, or data that has been received over a network. Before the input can reach the output, it has to be formatted so that it can be received correctly. This formatting takes place in the processor.

A processor can do many things to the data, some of which include compressing/decompressing, applying effect filters and converting into the correct format using the encoding scheme which has been specified. Once the data has been correctly formatted by the processor, it is then passed on to the output so that the end user can see or hear it.

The output could simply be a player, such as a speaker or a television, it could save the data to a file or it could send it across the network.

Figure 3.3.1 - Media Processing Model

To relate the media processing model shown above to this particular project, let us take a look at the project's own pipeline. As can be seen immediately, this system has more components than the one shown above. However, it can still be divided into the same three parts: input, processor and output. The input consists of the MediaLocator, which represents the address of the device, and the data source, which is constructed using the MediaLocator and is the interface to the device.
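
To illustrate the input stage just described, the sketch below locates a registered video capture device through JMF's CaptureDeviceManager, takes its MediaLocator, and builds a DataSource from it. The wildcard VideoFormat(null) and the absence of error handling are simplifications; the project's actual device-selection code is not shown in this report.

    import java.util.Vector;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.DataSource;

    public class CaptureInput {
        // Returns a DataSource for the first video capture device registered with JMF.
        public static DataSource firstVideoSource() throws Exception {
            Vector devices = CaptureDeviceManager.getDeviceList(new VideoFormat(null));
            if (devices.isEmpty()) {
                throw new Exception("No video capture device registered with JMF");
            }
            CaptureDeviceInfo info = (CaptureDeviceInfo) devices.get(0);
            MediaLocator locator = info.getLocator();     // e.g. a vfw:// locator on Windows
            return Manager.createDataSource(locator);     // the interface to the device
        }
    }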


The data is then taken from the input and sent to the processor. The processor in the system consists of the processor itself, which takes the data and converts it into the encoding scheme that has been defined for the system. The other element of the processor is the RTPManager.

The transmission RTPManager takes the encoded data from the processor and packetizes it, so that it can be sent over the network. The data is then transmitted over the network where it is met on the other side by the receiver RTPManager, which takes the data and depacketizes it, converting it back into a format that can be read by the player. Once this stage has been completed, the data is passed to the output, consisting of the player and the speaker (the example shown here is for a voice call, the speaker could be a monitor or any other sort of output device that the media can be seen or heard on).

The player takes the encoded data and decodes it, then sends it to the output device so that the receiver can see or hear it.
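
A hedged sketch of the transmit half of this path is shown below: a Processor turns the capture DataSource into raw RTP output, and an RTPManager packetizes it and sends it to the target terminal. The class name, the host/port parameters, and the simple polling state wait are illustrative assumptions rather than the report's actual code.

    import java.net.InetAddress;
    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    public class RtpTransmitSketch {

        public static void transmit(DataSource capture, String targetHost, int port)
                throws Exception {
            // The processor re-encodes the captured data and outputs raw RTP packets.
            Processor processor = Manager.createProcessor(capture);
            processor.configure();
            waitForState(processor, Processor.Configured);
            processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            processor.realize();
            waitForState(processor, Processor.Realized);

            // The RTPManager packetizes the processor output and sends it to the target.
            RTPManager rtp = RTPManager.newInstance();
            rtp.initialize(new SessionAddress(InetAddress.getLocalHost(), port));
            rtp.addTarget(new SessionAddress(InetAddress.getByName(targetHost), port));
            SendStream stream = rtp.createSendStream(processor.getDataOutput(), 0);
            stream.start();
            processor.start();
        }

        // Crude state wait; real code would use a ControllerListener instead of polling.
        private static void waitForState(Processor p, int state) throws InterruptedException {
            while (p.getState() < state) {
                Thread.sleep(50);
            }
        }
    }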

Manager handles the construction of Players, Processors, DataSources, and DataSinks. This level of indirection allows new implementations to be integrated seamlessly with JMF. From the client perspective, these objects are always created the same way whether the requested object is constructed from a default implementation or a custom one.

PackageManager maintains a registry of packages that contain JMF classes, such as custom Players, Processors, DataSources, and DataSinks.

CaptureDeviceManager maintains a registry of available capture devices.

PlugInManager maintains a registry of available JMF plug-in processing components, such as Multiplexers, Demultiplexers, Codecs, Effects, and Renderers.


3.3.1 JMF Architecture

The most practical example of real-time media comes from a basic home movie system. Imagine someone is making a home movie; the first thing that they do is record it onto a video tape using a camcorder. So they are using a capture device – the camcorder – and recording onto a data source – the video tape. Once they have made the movie, the next logical thing that they would want to do would be to watch it. So, thinking of the system processing model, they would need some sort of processor that would take the data from the data source and convert it into some format that they can see and hear. This processor would be a VCR. When the data source is placed into the processor, the data is transmitted to the final stage of the system processing model – the output. In this case, the television will be the principal output device. There will more than likely be speakers on the television that will transmit the audio part of the media. So we have a very basic processing model that many people use every day at home.

3.3.2 Streaming Media

When media content is streamed to a client in real time, the client can begin to play the stream without having to wait for the complete stream to download. In fact, the stream might not even have a predefined duration, in which case downloading the entire stream before playing it would be impossible. The term streaming media is often used to refer both to this technique of delivering content over the network in real time and to the real-time media content that is delivered. Streaming media is everywhere you look on the web: live radio and television broadcasts and webcast concerts and events are being offered by a rapidly growing number of web portals, and it is now possible to conduct audio and video conferences over the Internet.


3.3.3 RTP Services

RTP enables you to identify the type of data being transmitted, determine what order the packets of data should be presented in, and synchronize media streams from different sources. RTP data packets are not guaranteed to arrive in the order that they were sent; in fact, they are not guaranteed to arrive at all. It is up to the receiver to reconstruct the sender's packet sequence and detect lost packets using the information provided in the packet header. While RTP does not provide any mechanism to ensure timely delivery or provide other quality-of-service guarantees, it is augmented by a control protocol (RTCP) that enables you to monitor the quality of the data distribution. RTCP also provides control and identification mechanisms for RTP transmissions. If quality of service is essential for a particular application, RTP can be used over a resource reservation protocol that provides connection-oriented services.
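
The receiving side can be sketched in the same hedged way: an RTPManager is initialized for the session, a ReceiveStreamListener is registered, and each new incoming stream is handed to a Player. Class names, addresses, and error handling below are illustrative only.

    import java.net.InetAddress;
    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.SessionAddress;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;

    public class RtpReceiveSketch implements ReceiveStreamListener {

        // Opens the RTP session and starts listening for incoming streams.
        public void listen(int localPort, String senderHost, int senderPort) throws Exception {
            RTPManager rtp = RTPManager.newInstance();
            rtp.addReceiveStreamListener(this);
            rtp.initialize(new SessionAddress(InetAddress.getLocalHost(), localPort));
            rtp.addTarget(new SessionAddress(InetAddress.getByName(senderHost), senderPort));
            // From here on, update() is called whenever the remote sender starts a stream.
        }

        // Called by the RTPManager; a new stream's DataSource is rendered with a Player.
        public void update(ReceiveStreamEvent event) {
            if (event instanceof NewReceiveStreamEvent) {
                try {
                    ReceiveStream stream = event.getReceiveStream();
                    Player player = Manager.createRealizedPlayer(stream.getDataSource());
                    player.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }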

3.3.4 Principal Elements

Data Source

In JMF, a DataSource is the audio or video source, or possibly a combination of the two, e.g. a webcam with an integrated microphone. It could also be an incoming stream across a network, for example the Internet, or a file. Once the location or protocol of the data is determined, the data source encapsulates both the media location and the protocol and software used to deliver the media. When a DataSource is sent to a Player, the Player is unconcerned about the origin of the DataSource.

There are two types of DataSource, determined by how the data transfer is initiated:

Pull data source: the data flow is initiated by the client, and the data flow from the source is controlled by the client.

Push data source: the data flow is initiated by the server, and the data flow from the source is controlled by the server.

Several data sources can be combined into one. So if you are capturing a live scene with two data sources: audio and video, these can be combined for easier control.
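
For instance, an audio DataSource and a video DataSource obtained from two capture devices might be combined with JMF's Manager.createMergingDataSource, as in the short sketch below (the wrapper class and parameter names are illustrative):

    import javax.media.Manager;
    import javax.media.protocol.DataSource;

    public class MergeSources {
        // Combines separately captured audio and video into a single DataSource.
        public static DataSource merge(DataSource audio, DataSource video) throws Exception {
            return Manager.createMergingDataSource(new DataSource[] { audio, video });
        }
    }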


Capture Device

A capture device is the piece of hardware that you would use to capture the data, which you would connect to the DataSource. Examples would be a microphone or a webcam.

The captured media can then be sent to the Player, converted into another format or even stored to be used at a later stage. Like DataSources, capture devices can be either a push or a pull source.

If a capture device is a pull source, then the user controls when to capture the image; if it is a push source, then the user has no control over when the data is captured: it will be captured continuously.

Player

As mentioned above, a Player takes a stream of data and renders it to an output device. A Player can be in any one of a number of states. Usually, a Player would go from one state to the next until it reaches the final state. The reason for these states is so the data can be prepared before it is played. JMF defines the following six states for the Player:

Unrealized: In this state, the Player object has just been instantiated and does not yet know anything about its media.

Realizing: A Player moves from the unrealized state to the realizing state when the Player's realize() method is called. In this state, the Player is in the process of determining its resource requirements.

Realized: Transitioning from the realizing state, the Player comes into the realized state. In this state the Player knows what resources it needs and has information about the type of media it is to present. It can also provide visual components and controls, and its connections to other objects in the system are in place. A player is often created already in this state, using the createRealizedPlayer() method.

Prefetching: When the prefetch() method is called, a Player moves from the realized state into the prefetching state. A prefetching Player is preparing to present its media. During this phase, the Player preloads its media data, obtains exclusive-use resources, and does whatever else is needed to play the media data.

Prefetched: The state where the Player has finished prefetching media data and is ready to start.

Started: The Player enters the started state when its start() method is called; it is then presenting its media.
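
A minimal sketch of this life cycle is shown below, assuming a playable media URL: createRealizedPlayer() blocks until the Player reaches the realized state, and start() performs prefetching before playback begins. The file path in the usage comment is hypothetical.

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;

    public class PlayerStatesSketch {
        // Takes the Player from unrealized through realized, then prefetched and started.
        public static void play(String mediaUrl) throws Exception {
            Player player = Manager.createRealizedPlayer(new MediaLocator(mediaUrl));
            player.start();    // prefetches the media and begins presentation
        }
    }

    // Usage (hypothetical path): PlayerStatesSketch.play("file:///C:/demo/clip.avi");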


Processor

A Processor is a type of Player, which has added control over what processing is performed on the input media stream. As well as the six aforementioned Player states, a Processor includes two additional states that occur before the Processor enters the realizing state but after the unrealized state:

Configuring: A Processor enters the configuring state from the unrealized state when the configure() method is called. A Processor exists in the configuring state when it connects to the DataSource, demultiplexes the input stream, and accesses information about the format of the input data.

Configured: From the configuring state, a Processor moves into the configured state when it is connected to the DataSource and the data format has been determined. As with a Player, a Processor transitions to the realized state when the realize() method is called.

DataSink

The DataSink is a base interface for objects that read media content delivered by a DataSource and render the media to some destination, typically a file.

Format

A Format object represents an object's exact media format. The format itself carries no encoding-specific parameters or global-timing information; it describes the format's encoding name and the type of data the format requires. Format subclasses include AudioFormat and VideoFormat.

In turn, VideoFormat has six direct subclasses: H261Format, H263Format, IndexedColorFormat, JPEGFormat, RGBFormat, and YUVFormat.
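
As an illustration, once a Processor reaches the configured state its TrackControls can be forced to RTP-friendly variants of these formats. The particular encodings chosen below (JPEG/RTP for video, GSM/RTP for audio) are examples and not necessarily the ones this project uses.

    import javax.media.Format;
    import javax.media.Processor;
    import javax.media.control.TrackControl;
    import javax.media.format.AudioFormat;
    import javax.media.format.VideoFormat;

    public class TrackFormatSketch {
        // Sets each track of an already-configured Processor to an RTP format.
        public static void setRtpFormats(Processor configuredProcessor) {
            TrackControl[] tracks = configuredProcessor.getTrackControls();
            for (int i = 0; i < tracks.length; i++) {
                Format current = tracks[i].getFormat();
                if (current instanceof VideoFormat) {
                    tracks[i].setFormat(new VideoFormat(VideoFormat.JPEG_RTP));
                } else if (current instanceof AudioFormat) {
                    tracks[i].setFormat(new AudioFormat(AudioFormat.GSM_RTP));
                } else {
                    tracks[i].setEnabled(false);   // drop tracks that cannot be transmitted
                }
            }
        }
    }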


Manager

A manager, an intermediary object, integrates implementations of key interfaces that can be used seamlessly with existing classes. JMF offers four managers:

Manager: Use Manager to create Players, Processors, DataSources, and DataSinks.

PackageManager: This manager maintains a registry of packages that contain JMF classes, such as custom Players, Processors, DataSources, and DataSinks.

CaptureDeviceManager: This manager maintains a registry of available capture devices.

PlugInManager: This manager maintains a registry of available JMF plug-in processing components.


Chapter 4: System Design

4.1 CONTEXT DIAGRAM OF SERVER

4.2 Client connection to Server


The system design phase moulds analyzed requirements into shape.

This aims at developing the designs and diagrams which can be transformed into implementation.

Various diagrams explain different aspects of the design; these diagrams are by themselves self-explanatory.


4.1 CONTEXT DIAGRAM OF SERVER


4.2 Client Connection to Server

Each time the client opens the application, it is asked to connect to the server. The client has to enter the server's IP address or URL and a username. The client binds this information into an object and sends it to the server.

The server, which runs on a specific port, receives the object and identifies it by decoding its id. The server maintains two data structures holding the list of currently connected clients. If a client disconnects, it must authenticate with the server again.

All clients must have a unique username. The server checks every username it receives to make sure it is unique; if a user with the same name already exists, the server sends an appropriate message back to the client.
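
A minimal sketch of this uniqueness check is given below. The map from username to client socket is an assumption that stands in for the server's actual two data structures, which are not listed in the report.

    import java.net.Socket;
    import java.util.HashMap;
    import java.util.Map;

    public class UserRegistry {
        // username -> connection of the currently logged-in client
        private final Map<String, Socket> connectedUsers = new HashMap<String, Socket>();

        // Returns true if the name was free and is now registered; false means the server
        // should reply with a "username already exists" message to the client.
        public synchronized boolean register(String username, Socket connection) {
            if (connectedUsers.containsKey(username)) {
                return false;
            }
            connectedUsers.put(username, connection);
            return true;
        }

        // Called when a client disconnects, so the name can be reused after re-authentication.
        public synchronized void unregister(String username) {
            connectedUsers.remove(username);
        }
    }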



Custom Object Design

The Master class is extended by all the other custom object classes. All three classes extend Master and also define their own internal methods and data types.


Sequence Diagram


(Sequence diagram: the USER creates a request by sending data to the SERVER; the server sends the same object back for authentication; the client then sends a message or a file, and the server sends the response to the specified user.)


Chapter 5: System Implementation

5.1 Application Architecture and Overview

5.2 SERVER IMPLEMENTATION


The system is implemented in the pure object-oriented programming language Java. For the GUI implementation, the NetBeans IDE is used. The JMF API gives the best support for real-time data over the network.

5.1 Application Architecture and Overview

Our application is based on a pure object-oriented model. It uses the JMF API for real-time data transmission across the LAN. To use the application, the client must be connected to the server. The server authenticates the client and sends a response packet back to it; thus some level of authorization is performed at the server side.

The features of our application are listed below:

A GUI server.

Multithreading environment at each level.

Maintaining user logon and logoff requests.

Encoding and decoding of the data sent by clients in the form of objects.

Windows Theme for each form.


All necessary or critical inputs whose values affect performance are given at run time by the user. This feature is very useful for debugging purposes.

Run-time polymorphism to improve performance and to reduce the overhead of procedure and function calls.

A GUI player which gives all the information related to incoming data from the remote terminal.

An excellent transmitter which transmits both audio and video data to a given terminal.

Audio and video recording in .avi and .mov formats, with many options.

A GUI receiver and GUI transmitter which send data to different terminals. A new thread is created for each transmission, so the system never hangs.

All exceptions are recoverable, so the system will not become inconsistent in any situation; errors are reported to the user accordingly.

File transferring between two terminals.

Message chatting between all users at the same time. Every user can send messages to all other users and can also send a private message to a particular user.

A separate server application that does not depend on the client for any resource, and likewise for the client.

Model-View-Controller architecture is used for implementation.

It requires JMF to be installed on every PC it runs on.

Audio and Video Buffering.


5.2 Server Implementation

The class diagram for the server shows the main class LaunchServer, the listener interfaces ServerListener, ClientListener and ClientListenListener, and the thread classes ServerThread, ClientThread and ClientListenThread. The callbacks declared across the listener interfaces are:

    public void onServerRunning(boolean enable);
    public void onShutdown();
    public void onLogging(boolean enable);
    public void onLogFileClear();
    public void onViewLogFile();
    public void onShowConnectedUsers();
    public void onListen();
    public void onClose();
    public void onNewConnection(Socket s);
    public void onListenError(String description);
    public void onUserLeave(ClientListenThread c);
    public void onUserAuthenticate(ClientListenThread c, User user);
    public void onUserSendMessage(ClientListenThread c, Message msg);
    public void onSendControl(ClientListenThread c);
    public void onUserSendFile(ClientListenThread c, FileObject flo);


The server runs three threads from the main class LaunchServer.java.

ServerThread handles running the server and handling events from the server administrator.

ClientThread handles all new incoming requests to the server and allocates their input/output streams.

ClientListenThread handles user requests such as message sending, file sending, private message sending, etc.

All real-time data transfer is done without going through the server; this makes the transfer roughly twice as fast, because data travels directly between the two terminals instead of through the server. Thus all real-time data transfer happens without even touching the server.


Chapter 6: Implementation Details

We have used a Model-View-Controller (MVC) architecture for the implementation.

A view is created and presented to the user; it passes the user-entered data to the controller.

The controller calls the appropriate model class to handle the action, and the model returns the result to the controller.

The user directly invokes the controller program for execution, but due to run-time polymorphism the caller cannot predict which version of the process will be called. This gives high performance and low overhead.

The majority of the program classes implement the Runnable interface and run in their own threads. So when the controller transfers control to another class it does not have to wait for that class to complete, because its execution is started by a thread in parallel.

So the user can perform multiple activities without any trouble, as sketched below.
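
The sketch below shows this general pattern with illustrative names only: the controller starts each unit of work on its own thread and returns immediately, so the user interface never blocks while, for example, a file transfer is in progress.

    public class ControllerDispatch {
        // Hands the task to a new thread; the calling (GUI) thread returns at once.
        public void dispatch(Runnable task, String taskName) {
            new Thread(task, taskName).start();
        }
    }

    // Usage (hypothetical task class): controller.dispatch(new FileTransferTask(file, target), "file-transfer");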

All GUI forms and player-handling classes call their dispose method when they go out of scope, so that it is easy to re-create them in the process without damaging the main process thread.

We found it quite hard to maintain the data structures about active clients at the server. The result is somewhat less efficient, but it remains consistent.


Chapter 7: Implementation Results

The first form is the GUI server, which handles all incoming requests and maintains the list of connected clients. The administrator must start the server before it will accept requests from clients. A server user can see the connected clients and the log file to which server activity is written. The Enable Logging option lets the user specify whether or not actions should be logged. Finally, there is an Exit option, which is the only way to shut down the server. When the server shuts down, all clients are disconnected and their login form is displayed again.


The second form is the login form, which accepts a unique username and the IP address and port of the server.

After a successful login, the main window is displayed to the user. It has various options that satisfy user requests.


Message:

o When the user clicks on Message, a chat form is shown. It has a list of all connected clients. When a client types a message and presses the Send button, the client process creates an object of type Message and sends it to all connected clients. The user can see his own message echoed back to him by the server in the text area of his terminal window.

o The user can also send a private message to a particular user just by double-clicking that user's name on the right side of the window.

A private message is displayed only in the recipient's private message window.

The window is activated automatically at the other site when a user sends a message.

File Sharing

o When the user performs this action, he is asked to select a file from local storage. After selecting a file, the user is asked to enter a buffer size, which determines the file-read block size; the file is read into a linked list of blocks that is bound to the object to be sent (see the sketch after this list).

o At the other end, the user who receives a file is asked whether he wants to accept it or not, and has to specify a location for the incoming file.
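
The sketch below illustrates that file-reading step with assumed names: the file is read in blocks of the chosen buffer size into a linked list of byte arrays, which could then be bound to the object that is transmitted.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.LinkedList;

    public class FileChunker {
        // Reads the file at 'path' into blocks of 'bufferSize' bytes.
        public static LinkedList<byte[]> readChunks(String path, int bufferSize) throws IOException {
            LinkedList<byte[]> chunks = new LinkedList<byte[]>();
            FileInputStream in = new FileInputStream(path);
            try {
                byte[] buffer = new byte[bufferSize];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    byte[] chunk = new byte[read];
                    System.arraycopy(buffer, 0, chunk, 0, read);  // keep only the bytes read
                    chunks.add(chunk);
                }
            } finally {
                in.close();
            }
            return chunks;
        }
    }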

8. FUTURE ENHANCEMENTS & CONCLUSION

8.1 LIMITATIONS

As the software is an integration of various components, the failure of any one of them may lead to an overall failure. Although recovery strategies are provided, such a failure cannot always be prevented.

Some of the limitations may include:

The system may crash if it runs out of memory.

As this is a research-based project, it contains some flaws and loopholes.

As it performs various Player and Processor operations, a deadlock situation or a long wait to recover from a call is quite possible. We have tried to remove this through program logic, and in most cases the user experiences no delay, thanks to multithreading.

This application cannot run without the server, so if the server goes down no one can work or use any feature of the program that depends on the server.

8.2 FUTURE ENHANCEMENTS

As the developers of the application, we would have been glad to add the following features if time had not been a factor:

Find and remove unused methods and variables from the program.

Implement a codec to improve the quality of the transfer.


Implement codec chains to apply more than one codec to a stream of data, further improving transmission.

Greater use of the RTP API could enhance this application considerably.

Create real-time streaming of files.

Create better video transfer.

8.3 CONCLUSION

By successfully completing this project, we can implement a video conferencing system on any network that reduces the amount of data transferred and also gives better quality real-time media data. Further, the application also provides some basic communication utilities, such as messaging and voice over an IP network, audio-video recording, a player application, and a transmitting application, which give it much better flexibility by compensating for its inefficiency in some features.

9. References

9.1 Books

Java Network Programming, by Elliotte Rusty Harold.

Java: The Complete Reference, by Herbert Schildt.

Java I/O, O'Reilly publications.

9.2 Website Links

http://www.cs.technion.ac.il/~cshenig/project/
http://www.javacoffeebreak.com
http://www.netbeans.org
http://www.sun.java.com
