Wednesday, June 25, 2008

UG/PG project report format

All final-year M.Tech/B.Tech/MCA (CSE/IT) students are required to follow the enclosed guidelines scrupulously.

Any deviation from these guidelines, in whatever form, will not be accepted; such a deviation will force you to resubmit your project thesis.

Read the instructions thoroughly before you start writing the project thesis.

It is the responsibility of each one of you in the batch to see that the work completed is presented in a neat, logical, lucid and coherent way.

The Candidate shall supply a typed copy of the manuscript to the guide for the purpose of approval. Without prior approval, submission is not allowed.

(A Typical Specimen of Title page)

<TITLE OF THE PROJECT WORK>

A PROJECT REPORT

Submitted in partial fulfillment of requirements to

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY

For the award of the degree

B.Tech in CSE

By

<NAME OF THE CANDIDATE>

OCT 2008

CMR INSTITUTE OF TECHNOLOGY

(Approved by A.I.C.T.E.)

(Affiliated to Jawaharlal Nehru Technological University)

Kandlakoya, Hyderabad - 501401


1. GENERAL:

The manual is intended to provide broad guidelines to M.Tech/B.Tech/M.C.A. candidates in the preparation of the project. In general, the project shall report, in an organized and scholarly fashion, an account of the original research work of the project team leading to the discovery of new facts or techniques or the correction of facts already known (analytical, experimental, hardware oriented, etc.).

2. NUMBER OF COPIES TO BE SUBMITTED: M.Tech/B.Tech/M.C.A.: ONLY ONE copy for the department/college library and ONE copy for the guide, if the guide so requires.

3. ARRANGEMENT OF CONTENTS OF PROJECT:

The sequence in which the project material should be arranged and bound should be as follows:

1. Title page

2. Certificate from the Organization where the Project Work was done

3. Bona fide Certificate

4. Abstract

5. Acknowledgement

6. Table of Contents

7. List of Tables

8. List of Figures

9. List of Symbols, Abbreviations or Nomenclature (Optional)

10. Chapters

11. Appendices

12. References

The Tables and Figures shall be introduced in the appropriate places.

4. PAGE DIMENSIONS AND MARGIN:

The dimensions of the project report should be 290 mm x 205 mm. Standard A4 size (297 mm x 210 mm) paper may be used for preparing the copies.

The final project copies (at the time of submission) should have the following page margins:

Top edge : 30 to 35 mm

Bottom edge : 25 to 30 mm

Left side : 35 to 40 mm

Right side : 20 to 25 mm

The project should be prepared on good quality white paper, preferably not lower than 80 gsm (grams per square metre).

Tables and figures should conform to the margin specifications. Large size figures should be photographically or otherwise reduced to the appropriate size before insertion.

5. MANUSCRIPT PREPARATION:

The Candidate shall supply a typed copy of the manuscript to the guide for the purpose of approval. In the preparation of the manuscript, care should be taken to ensure that all textual matter is typewritten to the extent possible in the same format as may be required for the final project.

The headings of all items 2 to 12 listed in section 3 should be typed in capital letters without punctuation and centered 50mm below the top of the page. The text should commence 4 spaces below this heading. The page numbering for all items 1 to 8 should be done using lower case Roman numerals and the pages thereafter should be numbered using Arabic numerals.

5.1 Title page - A specimen copy of the title page for the B.Tech/M.C.A. project is given above.

5.2 Certificate from the Organization where the Project Work was done

5.3 Bona fide Certificate - The Certificate, typed in double spacing, should be in the following format:

Certified that this project work titled …………………………………….. is the bona fide work of Mr./Ms ……………………………. who carried out the work under my supervision, and submitted in partial fulfillment of the requirements for the award of the degree, BACHELOR OF TECHNOLOGY IN COMPUTER SCIENCE & ENGINEERING , during the year 2006 - 2007.

(Dr.XXXXXXXXXXX)

Prof. & Head

The certificate to be countersigned by the HOD.

5.4 Abstract - The Abstract should be an essay-type narrative not exceeding 600 words, outlining the problem, the methodology used for tackling it and a summary of the findings.

5.5 Acknowledgement - It should be brief and should not exceed one page when typed in double spacing.

5.6 Table of Contents - The table of contents should list all material following it as well as any material which precedes it. The title page, Bona fide Certificate and Acknowledgement will not find a place among the items listed in the Table of Contents, but their page numbers (in lower-case Roman numerals) are accounted for. One and a half spacing should be adopted for typing the matter under this head.

5.7 List of Tables - The list should use exactly the same captions as they appear above the tables in the text. One and a half spacing should be adopted for typing the matter under this head.

5.8 List of Figures - The list should use exactly the same captions as they appear below the figures in the text. One and a half spacing should be adopted for typing the matter under this head.

5.9 List of Symbols, Abbreviations and Nomenclature (Optional) - One and a half spacing should be adopted for typing the matter under this head. Standard symbols, abbreviations, etc. should be used.

5.10 Chapters - The chapters may be broadly divided into three parts: (i) an introductory chapter, (ii) chapters developing the main theme of the project, and (iii) results, discussion, conclusion and future work.

The main text will be divided into several chapters and each chapter may be further divided into several divisions and sub divisions.

· Each chapter should be given an appropriate title.

· Tables and figures in a chapter should be placed in the immediate vicinity of the reference where they are cited.

· Footnotes should be used sparingly. They should be typed in single spacing and placed directly underneath on the very same page that refers to the material they annotate.

5.11 Appendices - Appendices are provided to give supplementary information, which if included in the main text may serve as a distraction and cloud the central theme under discussion.

· Appendices should be numbered using Arabic numerals, e.g. Appendix 1, Appendix 2, etc.

· Appendices, Tables and references appearing in appendices should be numbered and referred to at appropriate places just as in the case of chapters.

5.12 List of References – Includes the details of any works of other researchers that are used either directly or indirectly. The origin of the material thus referred to should be indicated at the appropriate places in the project. A paper, a monograph or a book may be designated by the name of the first author followed by the year of publication, placed inside brackets at the appropriate place of reference. The citation may assume any one of the following forms.

Examples of citation

i) An improved algorithm has been adopted in the literature (Tsychiya 1980)

ii) Jenkins and Watts (1968) have dealt with this principle at length.

iii) The problem of mechanical manipulators has been studied by Shin et al. (1984) and certain limitations of the method used have been pointed out by Shin et al. (1984a).

The listing should be typed 4 spaces below the heading "REFERENCES", in alphabetical order, in single spacing, left justified. The reference material should be listed in the alphabetical order of the first author. The name of the author/authors should be immediately followed by the year and other details. A typical illustrative list given below relates to the citation examples quoted above.

REFERENCES

1. Ariponnammal S. and Natrajan S. (1984), "Transport Phenomena of SmSe1-xAsx", Pramana - Journal of Physics, Vol. 42, No. 5, pp. 421-425.

2. Barnard R.W. and Kellogg C. (1980), "Applications of Convolution Operators to Problems in Univalent Function Theory", Michigan Math. J., Vol. 27, pp. 81-94.

3. Jenkins G.M. and Watts D.G. (1968), "Spectral Analysis and its Applications", Holden-Day, San Francisco.

4. Shin K.G. and McKay N.D. (1984), "Open Loop Minimum Time Control of Mechanical Manipulators and its Applications", Proc. Amer. Contr. Conf., San Diego, CA, pp. 1213-1236.

5.13 Tables and Figures - By the word "table" is meant tabulated numerical data in the body of the project as well as in the appendices. All other non-verbal material used in the body of the project and appendices, such as charts, graphs, maps, photographs and diagrams, may be designated as figures.

· A table or figure, including its caption, should be accommodated within the prescribed margin limits and appear on the page following the page where its first reference is made.

· Tables and figures on half page or less in length may appear on the same page along with the text. However, they should be separated from the text both above and below by triple spacing.

· All tables and figures should be prepared on the same paper or material used for the preparation of the rest of the project.

· For preparing captions, numerals, symbols or characters in the case of tables or figures, the computer should be used.

· Two or more small tables or figures may be grouped, if necessary, on a single page.

· Wherever possible, the entire photograph(s) may be reproduced on a full sheet of photographic paper.

· Photographs, if any, should be included in colour xerox form only. More than one photograph can be included on a page.

6. TYPING INSTRUCTIONS

6.1 General

This section includes additional information for the final typing of the project. Some information given earlier under "Manuscript preparation" should also be referred to.

The impression on the typed/duplicated/printed copies should be black in colour.

If computer printers are used, uniformity of the font shall be observed.

Certain symbols, characters or markings not found on a standard typewriter may be hand written using Indian ink or a stylus pen (in case stencil sheets are used). Corrections, interlineations and crossing out of letters or words will not be permitted in any of the copies of the project intended for submission. Erasures, if made, should be neatly carried out in all copies.

A sub-heading at the bottom of a page must have at least two full lines below it or else it should be carried over to the next page.

The last word of any page should not be split using a hyphen. One and a half spacing should be used for typing the general text. Single spacing should be used for typing:

i) Long Tables

ii) Long quotations

iii) Foot notes

iv) Multi line captions

v) References

All quotations exceeding one line should be typed in an indented space, the indentation being 15 mm from either margin.

Double spacing should be used for typing the Bona fide Certificate and Acknowledgement.

6.2 Chapters

The format for typing chapter headings, division headings and sub-division headings is explained through the following examples.

Chapter heading : CHAPTER 1

INTRODUCTION

Division heading : 1.1 OUTLINE OF PROJECT

Sub-division heading: 1.1.2 Literature review

The word CHAPTER without punctuation should be centred 50 mm down from the top of the page. Two spaces below, the title of the chapter should be typed centrally in capital letters. The text should commence 4 spaces below this title, the first letter of the text starting 20 mm inside from the left-hand margin.

The division and sub-division captions, along with their numbering, should be left justified. The typed material directly below them should be offset 20 mm from the left-hand margin. Every paragraph should commence 3 spaces below the last line of the preceding paragraph, the first letter in the paragraph being offset 20 mm from the left-hand margin.

7. NUMBERING INSTRUCTIONS

7.1 Page Numbering

All page numbers (whether Roman or Arabic) should be typed without punctuation on the upper right-hand corner, 20 mm from the top, with the last digit in line with the right-hand margin. The preliminary pages of the project (such as the Title page, Acknowledgement, Table of Contents, etc.) should be numbered in lower-case Roman numerals. The title page is counted as (i), but this number should not be typed. The page immediately following the title page shall be numbered (ii) and it should appear at the top right corner as already specified. Pages of the main text, starting with Chapter 1, should be consecutively numbered using Arabic numerals.

7.2 Numbering of Chapters, Divisions and Subdivisions

The numbering of chapters, divisions and subdivisions should be done using Arabic numerals only, and decimal notation should be used for numbering the divisions and subdivisions within a chapter. For example, subdivision 4 under division 3 belonging to Chapter 2 should be numbered 2.3.4. The caption for the subdivision should immediately follow the number assigned to it.

Every chapter beginning with the first chapter should be serially numbered using Arabic numerals. Appendices included should also be numbered in an identical manner, starting with Appendix 1.

7.3 Numbering of Tables and Figures

Tables and figures appearing anywhere in the project should bear appropriate numbers. The rule for assigning such numbers is illustrated through an example: if a figure in Chapter 3 happens to be the fourth, then assign 3.4 to that figure. Identical rules apply for tables, except that the word Figure is replaced by the word Table. If figures (or tables) appear in appendices, then figure 3 in Appendix 2 will be designated as Figure A2.3. If a table is to be continued onto the next page this may be done, but no line should be drawn underneath an unfinished table. The top line of the table continued onto the next page should, for example, read Table 2.1 (continued), placed centrally and underlined.

7.4 Numbering the Equations:

Equations appearing in each chapter or appendix should be numbered serially, the numbering commencing afresh for each chapter or appendix. Thus, for example, an equation appearing in Chapter 2, if it happens to be the eighth equation in that chapter, should be numbered (2.8). Thus:

C(s)/R(s) = G1G2 / (1 + G1G2)          ... (2.8)

While referring to this equation in the body of the project it should be referred to as Equation (2.8).

8. BINDING SPECIFICATIONS

· The project submitted for B.Tech/M.C.A (ONE copy) should be bound using a flexible cover of thick white art paper. The spine of the bound volume should be of black calico of 20 mm width. The cover should be printed in black letters and the text for printing should be identical to what has been prescribed for the title page.

· Two copies of the demonstration/implementation of the project work carried out should be submitted on floppies or CDs.

Monday, March 31, 2008

CAMP BY FSF-AP CHAPTER

How Microsoft killed ODF

Hasn't anyone learned anything over the last few years? It doesn't matter if OOXML is approved or not. All that matters is that the process that gave ODF its international standing is ruined. ODF got where it is today because it is an international standard, not because it is necessarily the answer to every possible question. People believed in the ISO process and believed that a standard with its seal of approval was actually worth something in the real world. By badgering, bribing and threatening, Microsoft has effectively destroyed the ISO process.

So who cares if OOXML becomes a standard or not? No one, if there isn't a gold standard for it to be judged against. While ODF was a saint, the sinner OOXML looked very dark and shabby. Now that Microsoft has cast doubt on the lineage of ODF, everyone is a sinner. If you will excuse an awful analogy: which would you prefer to eat, ice cream or sawdust? Easy choice, eh! Now that everyone knows the ISO process can be corrupted, the choice can be portrayed as one dodgy standard versus another. So, what do you want to eat now, sawdust or coal? Not as clear cut any more, is it!

As soon as one national body fell to the manipulation of Microsoft, OOXML had won. In the world of FUD and dirty deals, Microsoft is king. It's made a career out of muddying the waters to hide its own inadequacies and inconsistencies. There is (or maybe, these days, was) a saying: 'My country, right or wrong'. For some ISO members maybe we should change it to 'My company, right or wrong'.


In this whole sordid tale, some people stood above the crud and for that they should be saluted... and some didn't ....

ref:fsdaily.com
post by rakesh

Tuesday, March 25, 2008

What is Copyleft?

Copyleft is a general method for making a program or other work free, and requiring all modified and extended versions of the program to be free as well.

The simplest way to make a program free software is to put it in the public domain, uncopyrighted. This allows people to share the program and their improvements, if they are so minded. But it also allows uncooperative people to convert the program into proprietary software. They can make changes, many or few, and distribute the result as a proprietary product. People who receive the program in that modified form do not have the freedom that the original author gave them; the middleman has stripped it away.

In the GNU project, our aim is to give all users the freedom to redistribute and change GNU software. If middlemen could strip off the freedom, we might have many users, but those users would not have freedom. So instead of putting GNU software in the public domain, we "copyleft" it. Copyleft says that anyone who redistributes the software, with or without changes, must pass along the freedom to further copy and change it. Copyleft guarantees that every user has freedom.

more>>>
posted by rakeshkumar
ref: fsdaily
csestuff.co.cc

Saturday, March 22, 2008

India votes NO for OOXML

After a colossal amount of debate and discussion over the last one year, India has finally voted NO for OOXML. Today the committee was asked "Should India change its September 2007 No vote into Yes?"

13 members voted No
5 members (including Microsoft, of course) voted Yes.
1 member abstained
3 did not attend

The government bodies, academic institutions and industry voted against OOXML. The only people who voted for OOXML were the software exporters--TCS, Infosys, Wipro and NASSCOM (National Association of Software Services Companies).

posted by rakeshkumar
www.close2job.com
ref: osindiablogspot.com

Impossible thing #2: Comprehensive free knowledge repositories like Wikipedia and Project Gutenberg

Project Gutenberg, started in 1971, is the oldest part of the modern free culture movement. Wikipedia is a relative upstart, riding on the wave of success of free software, extending the idea to other kinds of information content. Today, Project Gutenberg, with over 24,000 e-texts, is probably larger than the legendary Library of Alexandria. Wikipedia is the largest and most comprehensive encyclopedic work ever created in the history of mankind. It’s common to draw comparisons to Encyclopedia Britannica, but they are hardly comparable works—Wikipedia is dozens of times larger and covers many more subjects. Accuracy is a more debatable topic, but studies have suggested that Wikipedia is not as much less accurate than Britannica as one might naively suppose.

posted by rakesh
www.close2job.com
ref: freesoftwaremagazine.com

Making the impossible happen: the rules of free culture

In the mainstream, free culture is regarded with varying degrees of skepticism, disdain, and dewy-eyed optimism. It violates the “rules” by which we imagine our world works, and many people react badly to that which they don’t understand.

If the system of rules that we have based our entire industrial civilization on is wrong, will we have to face the prospect of re-ordering that society from the ground up? Will that civilization now collapse (like Wile E. Coyote falling once he notices there’s no ground underneath him)?

On the opposite extreme, for those who’ve given up on the rationalizations, preferring a “faith-based” approach, there is a great tendency to leap to magical thinking. Perhaps there are gods of freedom reordering the world to make it a happier place? If we shake our rattles hard enough, will all our dreams come true?

But where is genuine reason in all this? Here, I’ll present six “impossible” achievements of free culture, each representing a particular challenge to the old paradigm. Then I’ll present a set of rules to help understand “how the magic works”, and give a more realistic framework for what can and can’t be expected from the commons-based methods on which free culture operates.

more>>
posted by rakeshkumar
www.close2job.com
ref:freesoftwaremagazine.com

Friday, March 14, 2008

Free-software lawyers: Don't trust Microsoft's Open XML patent pledge

Prominent legal counsel the Software Freedom Law Center said that the legal terms covering Microsoft's Open XML document formats pose a patent risk to free and open-source software developers.

The SFLC on Wednesday published a legal analysis of Microsoft's Open Specification Promise (OSP), a document written to give developers the green light to make open-source products based on specifications written by Microsoft.

The OSP is meant to allay concerns over violating Microsoft patents that relate to Open XML, Microsoft's document specifications that the company is trying to have certified as a standard at the ISO (International Organization for Standardization). For example, a company could create an open-source spreadsheet or server software that can handle Open XML documents.

Microsoft is awaiting the results of a crucial vote, expected by March 29, from representatives of national standards bodies.

But the SFLC said that the OSP is not to be trusted. It said that it did the legal analysis following the close of a recent Ballot Resolution Meeting held to resolve problems with the Open XML specification.

Specifically, the SFLC concluded that the patent protections only apply to current versions of the specifications; future versions could not be covered, it noted.

Also, software developers who write code based on a Microsoft-derived specification, such as Open XML, could be limited in how that code is used. "Any code that implements the specification may also do other things in other contexts, so in effect the OSP does not cover any actual code, only some uses of code," according to the analysis.

Finally, the SFLC said that OSP-covered specifications are not compatible with the General Public License (GPL), which covers thousands of free and open-source products.

Most open-source software advocates have opposed Microsoft's effort to standardize Open XML and the SFLC is no exception.

While not attempting to clarify the text of the OSP to indicate compatibility with the GPL or provide a safe harbor through its guidance materials, Microsoft wrongly blames the free software legal community for Microsoft's failure to present a promise that satisfies the requirements of the GPL. It is true that a broad audience of developers could implement the specifications, but they would be unable to be certain that implementations based on the latest versions of the specifications would be safe from attack. They would also be unable to distribute their code for any type of use, as is integral to the GPL and to all free software.

As the final period for consideration of OOXML by ISO elapses, SFLC recommends against the establishment of OOXML as an international standard and cautions GPL implementers not to rely on the OSP.

ref: fsdaily
rakeshkumar
close2job.com

Thursday, March 13, 2008

Microsoft's Open Specification Promise: No Assurance for GPL

There has been much discussion in the free software community and in the press about the inadequacy of Microsoft's Office Open XML (OOXML) as a standard, including good analysis of some of the shortcomings of Microsoft's Open Specification Promise (OSP), a promise that is supposed to protect projects from patent risk. Nonetheless, following the close of the ISO-BRM meeting in Geneva, SFLC's clients and colleagues have continued to express uncertainty as to whether the OSP would adequately apply to implementations licensed under the GNU General Public License (GPL). In response to these requests for clarification, we publicly conclude that the OSP provides no assurance to GPL developers and that it is unsafe to rely upon the OSP for any free software implementation, whether under the GPL or another free software license.

for more

ref: fsdaily
posted by rakeshkumar
www.close2job.com

Tuesday, March 11, 2008

Virtualization

What is Virtualization?
Virtualization is a proven software technology that is rapidly transforming the IT landscape and fundamentally changing the way that people compute.

Today’s powerful x86 computer hardware was originally designed to run only a single operating system and a single application, but virtualization breaks that bond, making it possible to run multiple operating systems and multiple applications on the same computer at the same time, increasing the utilization and flexibility of hardware.

Virtualization is a technology that can benefit anyone who uses a computer, from IT professionals and Mac enthusiasts to commercial businesses and government organizations. Join the millions of people around the world who use virtualization to save time, money and energy while achieving more with the computer hardware they already own.

How Does Virtualization Work?
In essence, virtualization lets you transform hardware into software. Use software such as VMware ESX Server to transform or “virtualize” the hardware resources of an x86-based computer—including the CPU, RAM, hard disk and network controller—to create a fully functional virtual machine that can run its own operating system and applications just like a “real” computer.

Multiple virtual machines share hardware resources without interfering with each other so that you can safely run several operating systems and applications at the same time on a single computer.


ref:vmware
posted by www.close2job.com
rakesh kumar

Sunday, March 9, 2008

Virtualization era

QEMU is a processor emulator that relies on dynamic binary translation to achieve a reasonable speed while being easy to port to new host CPU architectures. In conjunction with CPU emulation, it also provides a set of device models, allowing it to run a variety of unmodified guest operating systems; thus it can be viewed as a hosted virtual machine monitor. It also provides an accelerated mode supporting a mixture of binary translation (for kernel code) and native execution (for user code), in the same fashion as VMware Workstation and Microsoft Virtual PC. QEMU can also be used purely for CPU emulation of user-level processes; in this mode of operation, it is most similar to Valgrind.

Features

* Supports emulating IA-32 (x86) PCs, AMD64 PCs, MIPS R4000, Sun's SPARC sun4m, Sun's SPARC sun4u, ARM development boards (Integrator/CP and Versatile/PB), SH4 SHIX board, PowerPC (PReP and Power Macintosh), and ETRAX CRIS architectures.
* Support for other architectures in both host and emulated systems (see homepage for complete list).
* Increased speed — some applications can run in close to real time.
* Implements Copy-On-Write disk image formats. You can declare a multi-gigabyte virtual drive; the disk image will only be as large as what is actually used.
* Also implements overlay images. You can keep a snapshot of the guest system, and write changes to a separate image file. If the guest system breaks, it's simple to roll back to the snapshot.
* Support for running Linux binaries for other architectures.
* Can save and restore the state of the machine (programs running, etc.).
* Virtual network card emulation.
* SMP support.
* Guest OS does not need to be modified/patched
* Performance is improved when the KQEMU kernel module is used.
* Command line tools allow a full control of QEMU without having to run X11.
* Remote control of emulated machine via integrated VNC server
* USB tablet support — this provides "grabless" mouse control. Activated with "-usb -usbdevice tablet".
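As a rough illustration of how these pieces fit together, here is a minimal command-line sketch (the image and ISO file names are made up, and exact option spellings can vary between QEMU versions):

# create a copy-on-write (qcow2) disk image; it grows only as data is written
qemu-img create -f qcow2 disk.img 8G

# boot an installer ISO with 512 MB of RAM, a USB tablet for "grabless" mouse
# control, and an integrated VNC server on display :1 for remote control
qemu -hda disk.img -cdrom install.iso -boot d -m 512 -usb -usbdevice tablet -vnc :1

Once the guest is installed, dropping the -cdrom and -boot options boots straight from the disk image.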

How to love Free Software in 3 steps: configure, make, make install

Tinkering with system core files, first aid kit

Let’s take a common example: you completely upset Windows XP’s core system DLL. Surprisingly, the OS still works. Explanation: the system dynamically replaces modified/removed core files from a hidden backup cache. This system already existed in Windows 2000, but in XP it covers pretty much all base install files (including Messenger). However, try removing all copies of the file you want to modify, all at the same time, refuse to restore the file from CD and see the system crash and burn.

In effect, for Joe User, you can’t corrupt the system because you are actively prevented from tinkering with it, and the system automatically reverts anything you try to do to it while it’s running. Moreover, the fact that Windows locks down opened files makes it difficult to really put the system down before a reboot (well, in that specific case anyway).

Under GNU/Linux and xBSD: when you update a system file, create a backup. Also, it’s a good idea to learn what a minimal booting system requires. Add to that, nothing stops you restoring your system from a boot CD if you’ve kept a backup of your modified files: there’s no checksum of the ‘correct’ files stored in a registry somewhere that would prevent you from restoring backup files. Last but not least, most package managers allow you to ask for a package reinstall, which will reset all its settings to default.
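For that package-reinstall route, here is a hedged sketch; the package name is hypothetical and the exact command depends on your distribution's package manager:

# Debian/Ubuntu: fetch the package again and reinstall it over itself
apt-get install --reinstall somepackage

# RPM-based distros: reinstall from a package file you already have on disk
rpm -Uvh --replacepkgs somepackage-1.0.rpm

Either way the package's files are laid down fresh, which is usually enough to undo an ill-advised edit to a system file.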

Finally, library versioning under UNIX-like systems is quite developed: not only can you host several versions of a library, soft links and rules for dynamic linking allow you to create a special version of a library which will be linked to by a single software, without much trouble.

In short, there is little chance of trashing a Linux system in an unrecoverable way even if you tinker with system files, unless you go at it as root, with a hammer and matching subtlety.

Now though, it’s not GNU’s or BSD’s or (usually) Windows’ fault if you trash the hard disk.
Tinkering with partitions, the pitfalls

There are three great sources of damage to GNU/Linux partitions:

* outdated boot manager data (LILO or GRUB); it usually happens after a kernel update not followed by a GRUB or LILO refresh,

* badly enumerated partitions; it usually happens when removing, resizing and moving partitions on a complex layout disk,

* an overwritten Master Boot Record; it usually happens when you install Windows XP or Vista (Windows 2000 is a better citizen here).

The pitfalls are various, and can indeed make one wonder. However, at least with GNU/Linux you can hope for a recovery, while an OS like Windows will often require a reinstall (a cloned partition of mine insisted on calling itself ‘F:' after restore, no way to boot the system to correct that, and the registry hives all got corrupted).

The first problem is easy to avoid: keep a working kernel installed as long as you’re not sure the second one works, and always update LILO or GRUB after you’ve tinkered with kernels.
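As a sketch of that refresh step (the device name is an assumption, and which command applies depends on whether your system boots with LILO or GRUB):

# LILO: re-read /etc/lilo.conf and rewrite the boot sector
lilo

# GRUB legacy on Debian-style systems: regenerate the boot menu after a kernel change
update-grub

# if the boot sector itself was overwritten, reinstall GRUB to the MBR
grub-install /dev/sda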

For the second case, if you start resizing, destroying and creating partitions all over the place, make sure you have an efficient LiveCD on hand (Knoppix being a reference): not only is it a good recovery tool, it’s also better not to work on a ‘live’ system (not that it’s impossible, just that it saves you from juggling with chroot all the time): it will allow you to revert partition changes and/or update your ‘live’ /etc/fstab file in a matter of minutes. Moreover, once you’re editing this file, several options are open to you.

You have two ways to address a peripheral in /etc/fstab to mount it: either you call it through /dev (like /dev/sda1), or you use its UUID: the latter is much harder to write off the top of your head, but on the other hand it makes using a roaming GNU/Linux system much easier. Moreover, it doesn’t fall prey to partition resizing and movement troubles as easily as the /dev path method.
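As an illustration, the same root partition could be declared either way in /etc/fstab (the UUID value below is made up; you can read the real one with blkid /dev/sda1 or ls -l /dev/disk/by-uuid/):

# /dev path form: short, but breaks if partitions get renumbered
/dev/sda1  /  ext3  defaults  0  1

# UUID form: survives partition moves and renumbering
UUID=0a1b2c3d-4e5f-6789-abcd-0123456789ab  /  ext3  defaults  0  1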
Crush the kernel

Some distributions allow much easier tinkering with the kernel than others: Mandriva for example allows you to build “vanilla” kernel sources with a few command lines, while Ubuntu is much more painful because it won’t automatically build the kernel’s RAMdisk image that contains required modules for boot. You can find more information on your distribution’s forums (on top of that, forum posts from one distribution may apply to another; on the matter of compile time options, the Gentoo forum is a gold mine).

When it’s a matter of adding kernel modules, the system gives you enough warnings before you do something stupid, to prevent you from crashing your system:

* if the module isn’t provided with the vanilla kernel, it may not be very stable;

* if the module isn’t provided with the distribution’s kernel, it is probably quite unstable;

* if dmesg returns a kernel version mismatch on module load, it may not even work at all;

* if dmesg returns symbol mismatches, you’re trying to fit a square peg in a round hole.

After that, if you insist on loading the module and force it, pray. Just, pray. Hard.
Compilation: when to do it

There are 3 cases in which you may want to compile a package from source:

1. it isn’t provided by your distro, and you need it (have you checked all available sources? Alternate repositories?)
2. the provided version is old, or buggy, or slow
3. the provided version hasn’t been compiled with the options you want

For 1, you could try to get a package from another distro: alien and smart even allow you to use Debian packages on Red Hat-packaged distros and vice versa, so what would be the point of compiling? Except when you just know that the package is easy to compile (but then, you wouldn’t be reading these lines; you’d already be done).

For 2, well, same as option 1. If you want to play with fire, be ready to get burnt: backups, backups, backups. Of course, a properly set up kernel source can give your machine a sensible performance boost (disabling multiple processors support on single core systems, compiling with specific processor options instead of generic i586 for 32-bit systems, disabling all debugging options, those may be worth it; but the rest isn’t worth the trouble).

For 3, you could get your distro’s source package: it should be provided with original build instructions, and at least you’ll be sure that other installed software won’t baulk.

In short: do you really need to compile from source?
Compilation 2: what to do, what not to do

First, Read The Friggin’ Manual! It could be the README or INSTALL file provided with most source packages. If the package isn’t provided with a configure file, you have two options:

* either the tarball contains an experimental snapshot of the software, and you’ll need to build the makefile yourself, with automake or cmake; READ the instructions to know which is required and recommended for your system!

* or the tarball doesn’t require a makefile (the source files perform checks themselves, or the build system is preset, not requiring a configure script): you can skip to the make step.

If you can’t find any build instruction, try running make right away. If it doesn’t work, make will tell you what it’s missing. Go back to the top, lather, rinse, repeat.
Start configuration

As said before, you’ll probably find build instructions inside the tarball, one way or another. If you have a configure script but no manual, try ./configure --help to get a list of options. Use the ones you need (for example, many programmers compile to put compiled binaries inside /usr/local, but in many distributions you actually want to put everything inside /usr), and get started.

For example, ./configure --prefix=/usr will set up the system to put everything inside your ‘main’ system directories. For a first time compilation, it’s not recommended: put them somewhere else instead (like /usr/test, or something), you’ll later use soft links to make use of them instead of already installed libraries.

The script will run; look at it attentively, even the least customized one will tell you what it’s looking up. Once the process is finished, try to install as many of the missing packages (both libraries and development packages) as you deem necessary, and run ./configure again. Again, check it out closely. Some are more verbose than others, but at least you make sure that you won’t be missing too much right from the start.

Once there, it’s time for make: watch outputs closely, as an aborted compilation will, more often than not, be preceded by log messages like ‘line XXX: undeclared variable #something’. If you get several of them, it usually indicates a missing library not covered by ./configure (if it actually happens, it’s a good idea to inform the developer about it); install the library with the name closest to the missing variable, and try again. Now it should work.

Lather, rinse, repeat. Eventually, you’ll get a built package. make install usually covers the rest (just remember what prefix you’ve set up at the ./configure step), but I don’t recommend it right away: run the local binary first, and see if it works. Moreover, look up what files may be replaced: if possible, uninstall conflicting packages first.
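Putting the whole dance together, a typical session looks roughly like this (the package and binary names are made up, and the --prefix value follows the suggestion above of keeping a first build out of the main system directories):

tar xzf somethingie-1.2.tar.gz
cd somethingie-1.2
./configure --help               # see which options the script actually offers
./configure --prefix=/usr/test   # keep a first build out of /usr
make                             # watch the output for missing libraries
./src/somethingie                # try the freshly built binary in place first
make install                     # only once you are happy with it (run as root)

If make aborts, install the library or development package it complains about, then rerun ./configure and make as described above.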
Compilation done, what next?

Well, if you reach that point, you’re as far along as Bill Gates was when he shipped Windows 95 (‘it compiles! Quick, ship it!’). Next step, use your new system library. If you’ve followed my advice but couldn’t remove the package (too many dependencies), you now have an original library in use, and an unused custom one; move to where you want to put the file. Rename the old one (mv libthingie.so libthingie.so.old), and create a symbolic link to your new one (ln -s /usr/test/lib/libthingie.so .), run ldconfig and (re)start one piece of software that makes use of this library. If it works, restart all other processes. If it crashes, undo what you did and restore the older library (rm libthingie.so && mv libthingie.so.old libthingie.so) and check again that you compiled your library correctly (be careful about 32/64-bit, the failing process will complain about symbol mismatch).
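Collected into one place, the swap described above looks like this (libthingie is the article's own placeholder name, and /usr/lib and /usr/test/lib are assumptions about where the old and new copies live):

cd /usr/lib
mv libthingie.so libthingie.so.old     # keep the original within reach
ln -s /usr/test/lib/libthingie.so .    # point at the freshly built copy
ldconfig                               # refresh the dynamic linker cache

# if a program now fails with symbol errors, roll back:
rm libthingie.so && mv libthingie.so.old libthingie.so && ldconfig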

If you intended to build a kernel module (say, the highly unstable mach64 DRM module), then following the manual just works: all you need to do is check the sources out of git, make, then manually copy both drm.ko and mach64.ko to your kernel tree. Just don’t forget to make a backup of the original drm.ko (or drm.ko.gz) file somewhere safe and to run depmod -a after the copy. gzip them if you want, then try modprobe mach64 and look at the output of dmesg | tail to be sure there is no error.
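A sketch of that copy-and-load step (the module directory is an assumption; the exact path under /lib/modules depends on your kernel and distribution):

MODDIR=/lib/modules/$(uname -r)/kernel/drivers/char/drm

# back up the original drm module somewhere safe
cp $MODDIR/drm.ko ~/drm.ko.orig

# install the freshly built modules and refresh module dependencies
cp drm.ko mach64.ko $MODDIR/
depmod -a

# load the module and check the kernel log for errors
modprobe mach64
dmesg | tail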

Right then, you can restart Xorg and see if DRI is enabled: on some systems, it’ll load Xorg and composited GNOME, KDE or Xfce fine, until you try to display texture mapped polygons; on others, it’ll give you a black screen and sometimes a nice hard system hang, so it would have been a good idea to set initial init level in /etc/inittab to 3 (Mandriva, SuSE) or 1 (Ubuntu), or to have a backup kernel image somewhere.
Conclusion

I’m not qualified as a programmer; to be frank, I’ve never typed a line of C/C++ in my life. However, I’ve successfully built several packages from sources, by taking to heart a few simple instructions:

* read the manual
* read the manual again, because some stuff at the beginning may make more sense
* read the release files, as they sometimes contain contradictions with the manual, but at least you’ll know what to expect
* check out compile options, as they sometimes contain more details or the latest build syntax
* read the manual one more time, because you won’t remember it very well by this stage
* read script and compilation options, there could be some stuff hidden here
* make backups, and keep a LiveCD handy.

Once you’ve gotten used to it, compiling a package from sources (be it the kernel, a module, an Xorg driver, a library, whatever) and making use of it is no more difficult than reading a cooking recipe for making a pizza: just don’t forget to read it and to keep flour on your hands, otherwise the dough will stick.


reference :fsdaily
published by rakeshkumar
www.close2job.com

Saturday, March 8, 2008

ABOUT INDYAROCKS

About Indyarocks - Indyarocks is the fastest growing online and mobile community for Indians across the globe. Here you can create your profile, meet and make friends, share photos, blogs, post classifieds, catch up with the latest movie buzz and much more.
Who can join Indyarocks?
# Friends
# Families
# Classmates
# Co-workers
# Professionals
# Artists
# Organizations and
# Everyone who wants to be in regular touch with their friends

The free software movement is a political cause, not a technical one.

The free software movement is a political cause, not a technical one. "Choose based on technical criteria first of all" is the opposite of what we say.

There are many reasons why GNU packages should support other GNU packages.

The GNU Project is not just a collection of software packages. Its intended result is a coherent operating system. It is particularly important therefore that GNU packages should work well with other GNU packages. For instance, we would like Emacs to work well with git or mercurial, but we especially want it to work well with Bzr.

The maintainers of one GNU package should use other GNU packages so they will notice whether the packages work well together, and make them work well together.

We also promote use of other GNU packages in this way. Other people don't necessarily see which editor you use, but they all see what dVCS you use.

regards
rakesh

Wednesday, March 5, 2008

Too Many Patents? How Patent Inflation Plagues Information Technology

In 2004, Brandeis economist Adam Jaffe and Harvard Business School professor Josh Lerner published Innovation and its Discontents: How Our Broken Patent System is Endangering Innovation and Progress, and What to Do About It - a rare book on patents and written for generalists, not patent lawyers. "Broken" is strong language, but it gets attention.

Jaffe and Lerner argue that patents had become too easy to get and too powerful:

[W]e converted the weapon that a patent represents from something like a handgun or a pocket knife into a bazooka, and then started handing out the bazookas to pretty much anyone who asked me for one, despite the legal tests of novelty and non-obviousness. [p. 35]

They attribute this to two congressional decisions: creating a specialized patent appeals court, the Court of Appeals for the Federal Circuit, in 1982; and putting the Patent and Trademark Office (PTO) on a fee-funded basis.

Under the intellectual leadership of former patent attorney Giles Rich, the Federal Circuit spent much of its first 16 years enhancing the prospects of patent applicants and patent holders. The highwater mark was the notorious 1998 State Street decision, which Rich authored and which summarily eliminated the longstanding exclusion of patents for business methods. (1) Suddenly, patents were no longer limited to technology but available for any form of human activity.

By tying the PTO's budget to the fees it collected, Congress would inspire a new PTO mission, "to help customers get patents." Under the fee structure prescribed by Congress, the agency lost money on examinations but made money from issuance and maintenance fees. This internal cross-subsidy gave the agency an incentive to grant patents rather than deny them. (2) It embraced the flood of non-technological patents that followed State Street, arguing in international harmonization negotiations that allowing patents for all activities, not just technology, was "best practice." With support only from patent organizations, the U.S. delegation threatened to walk out of the negotiations if other governments did not go along. (3)

Patents on Intangibles

While there were rumblings in Congress following the State Street decision, the 54-person board of the Intellectual Property Owners Association resolved unanimously that Congress should keep its hands off business method patents. For good measure, it passed the resolution again the following year. (4) But were the board members speaking for upper management -- or for corporate patent departments? Even IBM signed on -- although IBM also went on record opposing business method patents, noting the fundamental problem with patents on intangibles: "[W]ith the advent of business method patenting it is possible to obtain exclusive rights over a general business model, which can include ALL solutions to a business problem, simply by articulating the problem." (5)

But patent institutions have a natural self-interest in expanding the scope and scale of the patent system. As one treatise puts it:

[B]road notions of patent eligibility appear to be in the best interest of the patent bar, the PTO, and the Federal Circuit [CAFC]. Workloads increase and regulatory authority expands when new industries become subject to the appropriations authorized by the patent law. Noticeably absent from the private, administrative and judicial structure is a high regard for the public interest. (6)

For similar reasons, the patent bar has also favored low standards of patentability. When the Supreme Court heard oral arguments in KSR International v. Teleflex, the attorney for Teleflex rushed to the defense of the Federal Circuit's low standard: "[R]emember, every single major patent bar association in the country has filed on our side". To which Justice Scalia countered that a low standard "produces more patents, which is what the patent bar gets paid for, to acquire patents, not to get patent applications denied but to get them granted. And the more you narrow the obviousness standard ... the more likely it is that the patent will be granted." Indeed, 40 years previously in Graham v. Deere, the Supreme Court was called on to interpret the standard in the 1952 Patent Act. Then, too, the patent bar lined up to claim that Congress had lowered the standard. Then, too, the Supreme Court disagreed.

The brief effort to rein in business method patents in 2000-01 was stymied by the unified voice of the corporate patent departments, as well as by the instant constituency of new patent holders and applicants. No reform bill introduced in the last few years has dared touch on subject matter limitations. However, the current House bill was amended to include a provision against patents on tax-avoidance strategies, a particularly obnoxious intrusion of the patent system into a very different policy domain. And a narrow provision restricting remedies for infringement of check imaging patents was added to the Senate bill.

The Supreme Court last addressed abstract subject matter in 1981, since which the Federal Circuit has made virtually anything patentable. Yet information technology has transformed the U.S. economy, not by identifiable patents, but by a powerfully enabling stack of open, unpatented protocols that we know as the Internet and the Web. (7)

Not until 2005 did the Court revisit patentable subject matter by agreeing to review Labcorp v. Metabolite, a case involving a medical diagnosis rather than software or methods. But the Court took the unusual step of reversing course and choosing not to decide the case, although three judges dissented against the decision not to decide, making it clear they would have rejected patentability.

Why has it taken 26 years and counting for the Court to focus on the critical distinction between abstract ideas and patentable subject matter?

Few litigants want to raise this issue before the Federal Circuit, since State Street seemed to state so strongly that anything is patentable as long as it's useful. Why stick your finger in the eye of the appeals court if you've got a fighting chance on other issues? Why risk ostracism from your brethren by advocating limitations on the scope and status of the profession?

Nonetheless, the inter-industry tensions over reform put the subject matter issue in a new light. AT&T v. Microsoft dealt with an obscure provision of the patent code concerning foreign assembly of components to create products that would infringe in the U.S. -- and whether this applied to reproducing software on media from a master disk. Eli Lilly filed an amicus brief blaming the whole controversy on the Federal Circuit's allowance of patents on intangible subject matter, signed by Eli Lilly's chief patent counsel, a past president of the American Intellectual Property Law Association.

Why would a drug company question patents for intangibles? It is no coincidence that the push for strong patent reform was originally spearheaded by the Business Software Alliance, and strong reform is supported by the financial services sector. Take away patents on intangibles, and much of the momentum behind reform evaporates. In the interests of preserving a unitary patent system in the traditional pharmaceutical model, it makes sense to lop off any outlying troublemakers, such as software and business methods. Although removing patents on intangibles would eliminate a vast source of income for patent professionals, the system would then remain narrowly focused on process in the hands of patent professionals -- and less pressured by the interests of nontraditional sectors.

Portfolio Patenting

It would not be easy for a field as diverse as "software" to agree to opt out, given the accumulation of patents at different levels of abstraction and the proliferation of business models, some of which are more patent-dependent than others. When the Patent and Trademark Office held hearings in 1994, almost all pureplay software publishers (with the notable exception of Microsoft) expressed opposition to software patents. But since then, all have amassed their own patent portfolios, giving them broad protection in the market niche they have traditionally occupied.

Portfolios turn the mythology of the patent system upside down. The policy justification of portfolio patenting in IT, expressed by Thinkfire CEO Dan McCurdy as "net users pay net innovators," makes for rough justice. However, it is different from the classic case for individual patents. Instead of protecting the upstart inventor armed with a patent, the system protects established companies who have had the time and resources to assemble substantial portfolios that function as renewable "thickets" to keep incumbents ensconced and to discourage new entrants from assembling full-blown products.

But there is a downside for established companies, too, that has recently become clear. The same conditions that allow them to amass vast portfolios easily also provide fertile ground for trolls. Lots of easy-to-get patents ensure that some will end up in the hands of speculators, some of whom will get lucky and find their patent is deeply embedded in the complex technology of a successful product.

Portfolio-driven patenting is not unique to software. It pervades IT and, to a lesser extent, other complex technologies, but anybody can generate patentable functionality in software. Software democratizes innovation. Writing software requires no laboratory, no PhD, no manufacturing plant, no distribution chain. Meanwhile, low standards, the presumption of entitlement, and the desire to impress supervisors, upper management, and venture capital induce the filing of tens of thousands of patents each of which may have dozens of claims.

The flipside of massively dispersed patent ownership is massively dispersed liability. Patents of failed companies often end up in the hands of trolls who are neither innovators, nor producers, nor users -- and have no need to license rights from others. Can those loose patents be avoided? At what cost? -- not only to identify problem patents but to figure out how good they are, who owns them, and under what terms they might be available.

Patent thickets impose huge costs because they require assistance from lawyers - quite apart from the costs of acquiring rights or designing around them. The tactically correct solution is not to search but to task lawyers to solve problems only if and when they arise. (8) At the same time, this jungle of rights and miasma of too much information to decipher and interpret creates cover for trolls. They can hide until producers and users have made huge investments in arguably infringing products. For trolls, patents are lottery tickets: if they are lucky, they will be infringed by a deep-pocketed producer. For producers, it's the risk of an aberrant judgment, which can perhaps be averted by flinging enough legal resources against it.

Other than anecdotal evidence, including the sad experience of the insurance industry, (9) it is virtually impossible to get a direct handle on these risks. However, new research by James Bessen and Michael Meurer, soon to be published in a book, Patent Failure, does so indirectly. (10) By examining market reaction to patent litigation, they show how investors view the risks and costs of patents imposed on different sectors. For software and business methods, these are very high indeed.
*****

Microsoft Rises to Sixth on Patent List for 2007

Microsoft was awarded more than 1,600 patents by the U.S. Patent and Trademark Office (USPTO) in 2007, placing it sixth on the list of biggest patent performers, according to IFI Patent Intelligence, which tracks patent awards. IBM, which tried but failed to patent outsourcing last year, won the patent count for the 16th straight year, with more than 3,100 patents.

One way to gauge the level of innovation occurring in the IT industry is to count the number of patents awarded to companies. Since the organizations getting the most patents year after year tend to be developers of hardware and software for businesses and consumers (except for the occasional car maker, such as Honda Motor, which ranked 19th in 2007), this would seem to be a fairly accurate way to tell who has the most creative and productive research and development departments. (IBM's attempt to patent outsourcing was quite creative, but it wasn't productive.)

However, since the USPTO stopped publishing the list of companies receiving the most patents last year, under the assumption that focusing on patent counts was a poor way of gauging creativity, interested parties must now count the patents themselves. Or, if they have better things to do, they can turn to IFI Patent Intelligence, an outfit out of Wilmington, Delaware, to do the heavy counting.

According to IFI's analysis, Microsoft was awarded 1,637 patents last year, nearly a 12 percent increase in the number of patents it received in 2006, when it was number 12 on the list.

Microsoft's increase in patents bucked the trend in patents last year, which saw nearly a 10 percent decline in the number of patents issued by the USPTO.

Darlene Slaughter, general manager of IFI Patent Intelligence, says the 157,284 utility patents issued last year was more or less in line with recent historical averages. "Although the total number of patents issued is down from 2006's record high, it did beat 2005's relatively low showing," she says. "Overall, it's fair to say that 80 percent of the top 35 organizations were down versus the previous year."

There is currently a huge backlog of patents pending, according to IFI. The most recent USPTO annual report shows there were more than 1.1 million patents pending for fiscal year 2007, which means that slightly more than 10 percent of patents applied for are actually granted.

Here's IFI's list of top 10 patent performers of 2007, followed by the number of patents they received:

* IBM--3,148
* Samsung Electronics--2,725
* Canon--1,987
* Matsushita Electric Industrial--1,941
* Intel--1,865
* Microsoft--1,637
* Toshiba--1,549
* Sony--1,481
* Micron Technology--1,476
* Hewlett-Packard--1,470

Thursday, February 28, 2008

Wikipedia, Wikia and the Future of Free Culture

Jimmy Wales, founder of Wikipedia, talks about how Wikipedia aims to give people access to the sum of human knowledge.

He describes how Wikipedia is very popular across the globe, with hundreds of thousands of articles in a number of languages.

He notes how Wikipedia is a charitable organization with only 12 employees. Yet, look at the effect they have on the world.

Spent $1,000,000 in 2006 - supported by small donations from people around the world. (Also gets donations of caching servers around the world.)

Wales goes on to define, what he means by free access to the sum of human knowledge. “Free as in speech, not as in beer.”

The Wikipedia’s license allows you the:
Freedom to copy
Freedom to modify
Freedom to redistribute
Freedom to redistribute modified versions

He goes on to further refine his term the sum of all human knowledge. He describes Wikipedia as an encyclopedia, not a data dump. This is a Global Movement to gather content in every single language.

The folks from Wikipedia ask people to build their own wikipedia in their own language. They have a “Wikipedia academy” which offers seminars on how to edit wikipedia. (note: check http://icommons.org)

He notes it takes 5 to 10 regular users in a community to sustain regular posting in a particular language / about a particular subject.

For instance, he tells the story of the “Father of Swahili”. Every night he wrote articles for Wikipedia. This person then reached out to Swahili language bloggers. Then the 5 to 10 contributors gathered together and started helping each other.

Wales goes on to describe Wikia as a completely separate venture. Wikia is like every other kind of book, only it is writing that people build as an online community.

Goal of Wikia is for profit, but it still is freely distributed.

He offers an explanation about why this all is happening… The internet is all about consumer media.

He cites the example of the “Muppet Wiki” as a resource for community created long tail content.

He cites bloggers as performing the function of “Armchair Analysis” and says that “a good blog is equal to the editorial section of the New York Times.”

He points out that Wikipedia aims to embrace neutrality as a core value: Wikipedia should not take a stand on any controversial issue, and it should describe the fight rather than taking a stance.

The nature of wikis is such that there is the potential for content to be destroyed. But he points out that the writing that survives is writing that people can mutually agree is an accurate description.

In a moment of self-assessment, he notes that Wiki News reporting is not too good. He notes that “News requires infrastructure, the ability to be patient and wait for news to happen.” Although, one potential exception that he sees as fertile ground for exploration would be crowdsourced sports reporting.

He points out the following lessons for public broadcasters:

- a lot of this is made possible by the creative commons licensing framework

- he asks us to think about how new content could be created from public broadcasting content. To think about community reuse.

- he asks us to think about how we can release content in a way where we can get people interested in what we are doing. For example: he talks about how he worked to persuade art museums to make high quality photographs available for use on Wikipedia - and how that encourages people to go to the museum.

The next section of the presentation is on Wikia and the future of search

(Insert link to Wikia Labs)
Note: this is a political statement about Open Source, about Open Access.

Here, Wales is offering people all the necessary software to set up a Free Search Engine - would this be useful for Jake Shapiro’s Pirate Media Bay initiative? Would this be useful for PBCore?


from
rakesh kumar
www.close2job.com

Tuesday, February 26, 2008

Setting software free: Eben Moglen and digital age morality

Reading Eben Moglen’s keynote address, “Freeing the Mind: Free Software and the Death of Proprietary Culture,” I felt a bit like Richard Stallman while he worked to replace UNIX with GNU: reaching the same destination but apprehensive about the other guy’s route.

Moglen, a law professor and founder of the Software Freedom Law Center, discusses free software v. the behemoths of software largely in moral terms. Software is a public utility, he argues, and one that must be recognized as such because, in the 21st century, it is the cornerstone of “the ethical right to share information” (Moglen 2003). And, for the first time in history, information isn’t weighed down by the heavy, expensive, hard-to-transport molecules upon which it was once carried—pages in books, magnetic clusters on videocassettes, optical stamps on compact discs. Now that everything can be transferred freely through the Internet, it is revealed that information was free all along; it was just those pesky info-bearing artifacts we were paying for.

Moglen goes on to say: “every piece of useful or beautiful information can be distributed to everybody at the same cost that it can be distributed to anybody. For the first time in human history, we face an economy in which the most important goods have zero marginal cost” (Moglen 2003). Quite true. But miraculous as a zero marginal cost may be, Moglen neglects to tackle the issue of the cost of the first copy. This was true long before the Internet.

The first copy of a book is exorbitantly expensive, because the costs begin when the author puts the first page into the typewriter and starts typing. Then there is the author’s time, effort, grocery bills, electricity bills, water, heat, trips to the library—and all before the book even gets to a book publisher, which the digital age purports to replace. The great thing about printing is that the cost of subsequent copies approaches zero. In the digital age, the cost of copies two to two million is zero. Wondrous as it is, it doesn’t erase the time, effort and money that went into copy one.


ref : fsdaily

from
rakeshkumar
www.csestuff.co.cc
www.close2job.com