
Sharing a small Python script for real-time log analysis

2020-02-16 01:30:18
Contributed by: a netizen

Preface

As anyone doing web operations knows, you have to keep an eye on per-domain real-time metrics such as 2xx/s, 4xx/s, 5xx/s, response time, and bandwidth. The logs used to be rotated every five minutes, so a simple awk was enough. Now that the logs are pushed to ELK, keeping the five-minute rotation causes problems, so rotation was changed to once a day. With a daily rotation, shell is clearly no longer a good fit, so I wrote the following in Python.

The approach is as follows:

The script mainly relies on the file object's seek and tell methods. The idea is:

       1. Add the script to crontab and run it every 5 minutes.

       2. On each run, only analyze the log entries between the end position recorded by the previous run and the current end of the file, then produce the results.
The results can be pushed to the Zabbix server with zabbix_sender, or read from the output file directly by the Zabbix agent, and then used for Zabbix graphing and alerting (a sample crontab entry and Zabbix hookup are sketched after the script). The code is as follows:

#!/usr/bin/env python
# coding: utf-8
from __future__ import division
import os

LOG_FILE = '/data0/logs/nginx/xxxx-access_log'
POSITION_FILE = '/tmp/position.log'
STATUS_FILE = '/tmp/http_status'
# crontab execution interval, in seconds
CRON_TIME = 300

def get_position():
    # First run: POSITION_FILE does not exist yet
    if not os.path.exists(POSITION_FILE):
        start_position = str(0)
        end_position = str(os.path.getsize(LOG_FILE))
        fh = open(POSITION_FILE, 'w')
        fh.write('start_position: %s\n' % start_position)
        fh.write('end_position: %s\n' % end_position)
        fh.close()
        os._exit(1)
    else:
        fh = open(POSITION_FILE)
        se = fh.readlines()
        fh.close()
        # Something unexpected left POSITION_FILE with other than two lines
        if len(se) != 2:
            os.remove(POSITION_FILE)
            os._exit(1)
        last_start_position, last_end_position = [item.split(':')[1].strip() for item in se]
        start_position = last_end_position
        end_position = str(os.path.getsize(LOG_FILE))
        # Log rotation shrank the file, so start over from the beginning
        if int(start_position) > int(end_position):
            start_position = 0
        # The log has stopped growing
        elif start_position == end_position:
            os._exit(1)
        fh = open(POSITION_FILE, 'w')
        fh.write('start_position: %s\n' % start_position)
        fh.write('end_position: %s\n' % end_position)
        fh.close()
        return map(int, [start_position, end_position])

def write_status(content):
    fh = open(STATUS_FILE, 'w')
    fh.write(content)
    fh.close()

def handle_log(start_position, end_position):
    log = open(LOG_FILE)
    log.seek(start_position, 0)
    status_2xx, status_403, status_404, status_500, status_502, status_503, status_504, status_all, rt, bandwidth = 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
    while True:
        current_position = log.tell()
        if current_position >= end_position:
            break
        line = log.readline()
        line = line.split(' ')
        # Field positions depend on the nginx log_format; adjust the indexes to match
        host, request_time, time_local, status, bytes_sent = line[1], line[3], line[5], line[10], line[11]
        status_all += 1
        try:
            rt += float(request_time.strip('s'))
            bandwidth += int(bytes_sent)
        except ValueError:
            pass
        if status == '200' or status == '206':
            status_2xx += 1
        elif status == '403':
            status_403 += 1
        elif status == '404':
            status_404 += 1
        elif status == '500':
            status_500 += 1
        elif status == '502':
            status_502 += 1
        elif status == '503':
            status_503 += 1
        elif status == '504':
            status_504 += 1
    log.close()
    avg_rt = rt / status_all if status_all else 0
    write_status("status_2xx: %s\nstatus_403: %s\nstatus_404: %s\nstatus_500: %s\nstatus_502: %s\nstatus_503: %s\nstatus_504: %s\nstatus_all: %s\nrt: %s\nbandwidth: %s\n" % (
        status_2xx / CRON_TIME, status_403 / CRON_TIME, status_404 / CRON_TIME,
        status_500 / CRON_TIME, status_502 / CRON_TIME, status_503 / CRON_TIME,
        status_504 / CRON_TIME, status_all / CRON_TIME, avg_rt, bandwidth / CRON_TIME))

if __name__ == '__main__':
    start_position, end_position = get_position()
    handle_log(start_position, end_position)
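
For completeness, here is a minimal sketch of how the script could be wired into cron and Zabbix. The script path /usr/local/bin/log_stat.py, the server zabbix.example.com, the host name web01, and the item keys are placeholders of my own, not from the original setup:

# crontab: run the analyzer every 5 minutes, matching CRON_TIME = 300
*/5 * * * * /usr/bin/python /usr/local/bin/log_stat.py

# Option 1: push a value to the Zabbix server with zabbix_sender,
# e.g. the 2xx rate read back out of /tmp/http_status
zabbix_sender -z zabbix.example.com -s web01 -k nginx.status_2xx -o "$(awk '/^status_2xx:/{print $2}' /tmp/http_status)"

# Option 2: let the Zabbix agent read the file itself (zabbix_agentd.conf)
UserParameter=nginx.stat[*],grep "^$1:" /tmp/http_status | cut -d' ' -f2

Either way, the script itself only writes /tmp/http_status; whether to push with zabbix_sender or pull through an agent item is a matter of preference.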