As the size of the web keeps growing, the job of a web crawler in covering the maximum portion of the web has become more cumbersome. An enormous number of web pages are available on the web, but only a fraction of them is accessible to users. A sitemap is an XML file that lists the URLs of a particular site. The sitemap protocol can play a very important role in maximizing web coverage. In this paper, the information provided by the sitemap protocol is used to crawl quality web pages. With the help of sitemaps, web crawlers can keep their repositories up to date. The approach also enables users to access the maximum number of pages by covering as much of the web as possible while crawling. It further helps the crawler visit pages based on their change frequency and download only updated pages, thereby reducing unnecessary network traffic.
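A minimal sketch (not taken from the paper) of the idea described above: a crawler reads a sitemap and selects only the URLs whose `<lastmod>` date is newer than its last visit, so it downloads updated pages only. The element names follow the sitemaps.org protocol; the sample sitemap, URLs, and dates here are illustrative assumptions.

```python
# Sketch: filter sitemap URLs by <lastmod> so only updated pages are recrawled.
# The sample sitemap below is a hypothetical example, not data from the paper.
from datetime import date, datetime
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc><lastmod>2015-10-05</lastmod><changefreq>daily</changefreq></url>
  <url><loc>http://example.com/about</loc><lastmod>2014-01-12</lastmod><changefreq>yearly</changefreq></url>
</urlset>"""

def urls_to_recrawl(sitemap_xml, last_visit):
    """Return (loc, changefreq) pairs for pages modified after last_visit."""
    root = ET.fromstring(sitemap_xml)
    due = []
    for url in root.findall(SITEMAP_NS + "url"):
        loc = url.findtext(SITEMAP_NS + "loc")
        lastmod = url.findtext(SITEMAP_NS + "lastmod")
        changefreq = url.findtext(SITEMAP_NS + "changefreq")
        # Only schedule a download if the page changed since the last crawl.
        if lastmod and datetime.strptime(lastmod, "%Y-%m-%d").date() > last_visit:
            due.append((loc, changefreq))
    return due

if __name__ == "__main__":
    # Only the page modified after the last visit (2015-01-01) is returned.
    print(urls_to_recrawl(SAMPLE_SITEMAP, date(2015, 1, 1)))
```

In this sketch, the `changefreq` hint is carried along so a scheduler could additionally prioritize frequently changing pages, which is the freshness aspect the abstract mentions.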
Table of Contents
Abstract
1. Introduction
2. Related Work
3. Design of an Efficient Migrating Crawler based on Sitemaps
3.1 Architecture of Proposed Work
4. Performance Analysis
5. Efficiency of Sitemap
5.1 Web Coverage
5.2 Preserves Bandwidth
5.3 Co-operation between Migrating Agents
6. Conclusion
References
Keywords
web crawler; sitemap; freshness; coverage; migrating agent
Authors
Deepika [ Computer Engineering Department, YMCAUST, Faridabad, India ]
Dr Ashutosh Dixit [ Computer Engineering Department, YMCAUST, Faridabad, India ]
Publisher: Science & Engineering Research Support Center, Republic of Korea (IJAST)
Year Established
2006
Field
Engineering > Computer Science
About
1. Surveys and research on security engineering
2. Research and presentation of applied technologies in security engineering
3. Hosting academic conferences and exhibitions on security engineering
4. Mutual cooperation and information exchange on security engineering technology
5. Standardization projects and establishment of specifications for security engineering
6. Promotion of industry-academia-research cooperation in security engineering
7. International academic exchange and technical cooperation
8. Publication of journals on security engineering
9. Other projects necessary to achieve the objectives of the center
Publication
Publication Title
International Journal of Advanced Science and Technology
Frequency
Monthly
pISSN
2005-4238
Coverage Period
2008~2016
Decimal Classification
KDC 505 / DDC 605
International Journal of Advanced Science and Technology, Vol. 83