pre_query: Parsing the Prometheus Query Log to Find Heavy Queries

This section covers:

  • pre_query project configuration file design
  • Copying the log files with ansible
  • Parsing the log files and identifying heavy queries

Create a new Python project: pre_query
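A minimal layout for the project, inferred from the files introduced in this section (the three file names come from the config and playbook shown below):

pre_query/
├── config.yaml                    # parser settings (next subsection)
├── prome_heavy_expr_parse.yaml    # ansible playbook that fetches the query logs
└── parse_prome_query_log.py       # main parser (py_name in config.yaml)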

Designing the configuration file

  • config.yaml

prome_query_log:
  prome_log_path: /App/logs/prometheus_query.log  # path of the prometheus query log file
  heavy_query_threhold: 5.0                       # heavy_query threshold
  py_name: parse_prome_query_log.py               # main file name
  local_work_dir: /App/tgzs/conf_dir/prome_heavy_expr_parse/all_prome_query_log  # local dir where the parser saves the fetched query logs
  check_heavy_query_api: http://localhost:9090    # a prometheus query endpoint used to double-check that a record really is heavy, avoiding false additions

redis:
  host: localhost  # redis address
  port: 6379
  redis_set_key: hke:heavy_query_set
  redis_one_key_prefix: hke:heavy_expr  # heavy_query key prefix
  high_can_result_key: high_can_result_key

consul:
  host: localhost  # consul address
  port: 8500
  consul_record_key_prefix: prometheus/records  # heavy_query key prefix

# all scrape targets, used to fetch high-cardinality data
scrape_promes:
  - 1.1.1.1:9090
  - 1.1.1.2:9090
  - 1.1.1.3:9090
  - 1.1.1.4:9090

heavy_blacklist_metrics:   # blacklisted metric_names
  - kafka_log_log_logendoffset
  - requests_latency_bucket
  - count(node_cpu_seconds_total)
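The parser reads these settings at startup. Below is a minimal loading sketch, assuming PyYAML; the upper-case constant names mirror the globals referenced later in parse_log_file, while LOCAL_WORK_DIR is a hypothetical name introduced here for illustration:

import yaml

# load config.yaml into the module-level settings used by the parser
with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

HEAVY_QUERY_THREHOLD = cfg["prome_query_log"]["heavy_query_threhold"]  # 5.0
HEAVY_BLACKLIST_METRICS = cfg["heavy_blacklist_metrics"]
REDIS_ONE_KEY_PREFIX = cfg["redis"]["redis_one_key_prefix"]            # hke:heavy_expr
LOCAL_WORK_DIR = cfg["prome_query_log"]["local_work_dir"]              # hypothetical name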

Copying the log files with ansible

The yaml file storing the variables: config.yaml

prome_query_log:
  prome_log_path: /App/logs/prometheus_query.log  # path of the prometheus query log file
  heavy_query_threhold: 5.0                       # heavy_query threshold
  py_name: parse_prome_query_log.py               # main file name
  local_work_dir: /App/tgzs/conf_dir/prome_heavy_expr_parse/all_prome_query_log  # local dir where the parser saves the fetched query logs
  check_heavy_query_api: http://localhost:9090    # a prometheus query endpoint used to double-check that a record really is heavy, avoiding false additions

The playbook that performs the copy

  • prome_heavy_expr_parse.yaml
  • It copies the query log of every prometheus host into the local directory

- name: fetch log and push expr to cache
  hosts: all
  user: root
  gather_facts: false
  vars_files:
    - config.yaml
  tasks:
    - name: fetch query log
      fetch: src={{ prome_query_log.prome_log_path }} dest={{ prome_query_log.local_work_dir }}/{{ inventory_hostname }}_query.log flat=yes validate_checksum=no
      register: result
    - name: Show debug info
      debug: var=result verbosity=0
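With all prometheus hosts listed in the inventory, the playbook can be run with `ansible-playbook -i <inventory> prome_heavy_expr_parse.yaml`. Note that flat=yes makes the fetch module save each host's log directly as {{ inventory_hostname }}_query.log under local_work_dir instead of recreating the remote directory tree, which is what lets the parser later glob one file per host.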

Parsing the log files to find heavy queries

The parser file: parse_prome_query_log.py

The parse_log_file function

  • Code
import json
import logging
from datetime import datetime


def parse_log_file(log_f):
    '''
    Sample log line:
    {"httpRequest":{"clientIP":"1.1.1.1","method":"GET","path":"/api/v1/query_range"},"params":{"end":"2020-04-09T06:20:00.000Z","query":"api_request_counter{job="kubernetes-pods",kubernetes_namespace="sprs",app="model-server"}/60","start":"2020-04-02T06:20:00.000Z","step":1200},"stats":{"timings":{"evalTotalTime":0.467329174,"resultSortTime":0.000476303,"queryPreparationTime":0.373947928,"innerEvalTime":0.092889708,"execQueueTime":0.000008911,"execTotalTime":0.467345411}},"ts":"2020-04-09T06:20:28.353Z"}
    :param log_f: path of one fetched prometheus query log
    :return: dict of {record_name: query}
    '''
    heavy_expr_set = set()
    heavy_expr_dict = dict()
    record_expr_dict = dict()
    with open(log_f) as f:
        for x in f.readlines():
            x = json.loads(x.strip())
            if not isinstance(x, dict):
                continue
            httpRequest = x.get("httpRequest")
            path = httpRequest.get("path")
            # only analyze range queries; instant queries are skipped
            if path != "/api/v1/query_range":
                continue
            params = x.get("params")
            start_time = params.get("start")
            end_time = params.get("end")
            stats = x.get("stats")
            evalTotalTime = stats.get("timings").get("evalTotalTime")
            execTotalTime = stats.get("timings").get("execTotalTime")
            queryPreparationTime = stats.get("timings").get("queryPreparationTime")
            execQueueTime = stats.get("timings").get("execQueueTime")
            innerEvalTime = stats.get("timings").get("innerEvalTime")

            # if the queried time range is longer than 6 hours, do not treat it as a heavy query
            if not start_time or not end_time:
                continue
            start_time = datetime.strptime(start_time, '%Y-%m-%dT%H:%M:%S.%fZ').timestamp()
            end_time = datetime.strptime(end_time, '%Y-%m-%dT%H:%M:%S.%fZ').timestamp()
            if end_time - start_time > 3600 * 6:
                continue

            # if both timings are below the threshold, it is not a heavy query
            c = (queryPreparationTime < HEAVY_QUERY_THREHOLD) and (innerEvalTime < HEAVY_QUERY_THREHOLD)
            if c:
                continue
            # timings above 40 seconds are treated as outliers and skipped
            if queryPreparationTime > 40:
                continue
            if execQueueTime > 40:
                continue
            if innerEvalTime > 40:
                continue
            if evalTotalTime > 40:
                continue
            if execTotalTime > 40:
                continue

            query = params.get("query").strip()
            # skip blacklisted metrics
            is_bl = False
            for bl in HEAVY_BLACKLIST_METRICS:
                if bl in query:
                    is_bl = True
                    break
            if is_bl:
                continue

            # avoid re-recording a query that already hits a heavy-query record
            if REDIS_ONE_KEY_PREFIX in query:
                continue

            # \r\n should not be in the query, remove it
            if "\r\n" in query:
                query = query.replace("\r\n", "")
            # \n should not be in the query, remove it
            if "\n" in query:
                query = query.replace("\n", "")

            # a leading "-" comes from grafana "network out" panels; strip it
            if query.startswith("-"):
                query = query.replace("-", "", 1)

            md5_str = get_str_md5(query.encode("utf-8"))
            record_name = "{}:{}".format(REDIS_ONE_KEY_PREFIX, md5_str)
            record_expr_dict[record_name] = query
            heavy_expr_set.add(query)

            # dedup: keep the largest evalTotalTime seen for the same query
            last_time = heavy_expr_dict.get(query)
            this_time = evalTotalTime
            if last_time and last_time > this_time:
                this_time = last_time
            heavy_expr_dict[query] = this_time

    logging.info("log_file:{} get :{} heavy expr".format(log_f, len(record_expr_dict)))
    return record_expr_dict
  • Check that the entry is a range_query; instant_query entries are not analyzed
            if path != "/api/v1/query_range":
                continue
  • Parse the timing fields from the query log
  • If the queried time range exceeds 6 hours, the entry is not considered a heavy query
            # if the queried time range is longer than 6 hours, do not treat it as a heavy query
            if not start_time or not end_time:
                continue
  • If both timings are below the threshold, it is not a heavy query
            # if both timings are below the threshold, it is not a heavy query
            c = (queryPreparationTime < HEAVY_QUERY_THREHOLD) and (innerEvalTime < HEAVY_QUERY_THREHOLD)
            if c:
                continue
  • Deduplicate with a dict and a set, since the log may contain multiple lines for the same heavy query expression
            last_time = heavy_expr_dict.get(query)
            this_time = evalTotalTime
            if last_time and last_time > this_time:
                this_time = last_time
            heavy_expr_dict[query] = this_time
  • Compute the md5 of each heavy query expression to build the key, keep the expression itself as the value, and return the dict (see the sketch below)
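parse_log_file depends on a get_str_md5 helper that this section does not show. A minimal sketch, assuming a plain hashlib md5 hex digest, together with a hypothetical driver loop that feeds every fetched log to the parser (LOCAL_WORK_DIR as loaded from config.yaml earlier):

import glob
import hashlib


def get_str_md5(byte_str):
    # hex md5 of the (already utf-8 encoded) query string
    return hashlib.md5(byte_str).hexdigest()


# hypothetical driver: parse every fetched <host>_query.log and merge the results
all_records = {}
for log_f in glob.glob("{}/*_query.log".format(LOCAL_WORK_DIR)):
    all_records.update(parse_log_file(log_f))
# all_records now maps "hke:heavy_expr:<md5>" -> query expression, ready to be
# written to the redis set / consul record keys configured in config.yaml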

Key points of this section:

  • pre_query project configuration file setup
  • Copying the log files with ansible
  • Parsing the log files and identifying heavy queries