
Tourism Comment Data Collection and Visualization Dashboard Using Python Flask

This article describes a Python Flask‑based system that crawls tourism site reviews, processes the data with pandas, stores it in MySQL, and visualizes sentiment analysis, keyword frequencies, and rating trends through interactive ECharts charts on a responsive web dashboard.

Python Programming Learning Circle

Background: Understanding tourist reviews is crucial for destination management and marketing. The project collects review data, performs sentiment analysis and keyword-frequency analysis, and presents the results on a Flask-based visualization dashboard to support decision-making.
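As a sketch of the keyword-frequency step: assuming comments have already been tokenized (e.g. with a segmenter such as jieba, which the article does not name explicitly), counting can be done with the standard library. Function and variable names here are illustrative, not from the project:

```python
from collections import Counter

def keyword_frequencies(token_lists, stopwords=frozenset(), top_n=10):
    """Count keyword frequencies across tokenized comments.

    token_lists: iterable of token lists (e.g. produced by a segmenter).
    stopwords: tokens to ignore.
    Returns the top_n (token, count) pairs, most frequent first.
    """
    counts = Counter()
    for tokens in token_lists:
        counts.update(t for t in tokens if t not in stopwords and t.strip())
    return counts.most_common(top_n)

comments = [
    ["风景", "很", "美", "值得", "一", "去"],
    ["风景", "不错", "人", "很", "多"],
]
print(keyword_frequencies(comments, stopwords={"很", "一"}))
```

The top pair here is ("风景", 2), i.e. "scenery" appears in both comments once the stopwords are filtered out.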

Technology Stack: Flask, an HTML/JS/CSS front end, MySQL 8.0, pandas, ECharts for visualization, and requests for web scraping.
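The pandas-to-MySQL step might look like the sketch below. The table name, column names, and connection URL are assumptions, not from the article; the demo runs against SQLite for portability, whereas the project would use a URL such as `mysql+pymysql://user:pw@localhost/tourism`:

```python
import pandas as pd
from sqlalchemy import create_engine

def save_comments(rows, db_url):
    """Load scraped comment rows into a DataFrame and persist them.

    rows: list of dicts, one per scraped comment.
    db_url: any SQLAlchemy URL (SQLite here only for the demo).
    Returns the number of rows written.
    """
    df = pd.DataFrame(rows)
    engine = create_engine(db_url)
    # Append so repeated crawl runs accumulate rather than overwrite.
    df.to_sql("comments", engine, if_exists="append", index=False)
    return len(df)

n = save_comments(
    [{"content": "风景很美", "location": "上海", "score": 5}],
    "sqlite:///:memory:",
)
print(n)
```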

Front‑end Design: The page follows HTML5 standards with a three‑column layout, includes dynamic particle effects, uses jQuery and ECharts for interactive charts, refreshes data via setInterval, and adapts to various screen sizes.
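For the periodic refresh to work, the back end needs JSON endpoints the setInterval handler can poll. A minimal Flask sketch (the route path, field names, and values are hypothetical, not the project's actual API):

```python
from flask import Flask, jsonify

app = Flask(__name__)

def load_stats():
    # Hypothetical in-memory stand-in for the MySQL-backed statistics.
    return {"景点数": 42, "评论数": 1680}

@app.route("/api/stats")
def stats():
    # The front end's setInterval callback fetches this JSON periodically
    # and re-renders the ECharts instances with the fresh values.
    return jsonify(load_stats())

client = app.test_client()
print(client.get("/api/stats").get_json())
```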

Flask Implementation (main code):

class CorpData(SourceDataDemo):

    def __init__(self):
        """
        Override the fields of SourceDataDemo with this project's data,
        keeping the same structure.
        """
        super().__init__()
        # Title: "Tourist attraction comment collection and visualization dashboard"
        self.title = '旅游景点评论数据采集与可视化大屏'
        self.counter = {'name': '景点数', 'value': tj()[0]}   # attraction count
        self.counter2 = {'name': '评论数', 'value': tj()[1]}  # comment count
        self.echart1_data = {
            'title': '套餐类型分析',  # package-type analysis
            'data': pinpai()
        }
        self.echart2_data = {
            'title': '不同景点评论数',  # comments per attraction
            'data': jiage()
        }
        self.echarts3_1_data = {
            'title': '是否VIP分析',  # VIP vs. non-VIP breakdown
            'data': cpu_1()
        }
        self.echart4_data = {
            'title': '不同年份评论数对比',  # comment counts by year
            'data': [
                {"name": "数量", "value": xiaoliang()['数量']},  # '数量' = counts
            ],
            'xAxis': xiaoliang()['年'],  # '年' = year labels
        }
        self.echart5_data = {
            'title': '词频分析',  # word-frequency analysis
            'data': pm()
        }
        self.echart6_data = {
            'title': '评论数据',  # raw comment table
            'data': biao()
        }
        self.map_1_data = {
            # 'symbolSize': 80000,
            'data': sheng()  # per-province counts for the China map
        }
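The helper functions the class calls (tj, pinpai, jiage, and so on) are not shown in the article. As an illustration of what one of them might do, here is a hypothetical tj() computing the two counters with pandas; the DataFrame and column names are assumptions:

```python
import pandas as pd

def tj(df):
    """Hypothetical counterpart of the article's tj(): returns
    (number of distinct attractions, total number of comments)."""
    return df["景点"].nunique(), len(df)

df = pd.DataFrame({
    "景点": ["外滩", "外滩", "迪士尼"],   # attraction name
    "评论": ["很美", "人多", "好玩"],     # comment text
})
print(tj(df))
```

With the sample frame above this yields two distinct attractions and three comments, matching how `self.counter` and `self.counter2` consume `tj()[0]` and `tj()[1]`.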

Crawler Implementation (main code):

html = requests.post(posturl, data=json.dumps(request), headers=headers)
html1 = json.loads(html.text)
print('Crawling page ' + str(i))
items = html1['result']['items']
for k in items:
    try:
        pl = k['content']                      # comment text
        didian = k['ipLocatedName']            # commenter location
        zongping = k['publishTypeTag']         # overall verdict tag
        pf1 = k['scores'][0]['name']           # first sub-rating name
        pf1_score = k['scores'][0]['score']
        pf2 = k['scores'][1]['name']
        pf2_score = k['scores'][1]['score']
        pf3 = k['scores'][2]['name']
        pf3_score = k['scores'][2]['score']
        taocan = k['touristTypeDisplay']       # package / tourist type
        vip = k['userInfo']['userMember']      # VIP membership flag
        print(pl, didian, zongping, pf1, pf1_score, pf2, pf2_score, pf3, pf3_score, taocan, vip)
        ws.append([pl, didian, zongping, pf1_score, pf2_score, pf3_score, taocan, vip])
    except (KeyError, IndexError, TypeError):
        # Skip comments missing an expected field rather than aborting the page.
        pass
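The per-item extraction can also be factored into a function, which makes the failure mode explicit and testable. A sketch, using the same response field names as the snippet above:

```python
def extract(item):
    """Pull the fields the dashboard needs from one comment dict.
    Returns a row list, or None when a required field is missing."""
    try:
        scores = item["scores"]
        return [
            item["content"],
            item["ipLocatedName"],
            item["publishTypeTag"],
            scores[0]["score"],
            scores[1]["score"],
            scores[2]["score"],
            item["touristTypeDisplay"],
            item["userInfo"]["userMember"],
        ]
    except (KeyError, IndexError, TypeError):
        return None

# Sample item shaped like the API response fields used above.
item = {
    "content": "风景很美",
    "ipLocatedName": "上海",
    "publishTypeTag": "好评",
    "scores": [{"name": "景色", "score": 5},
               {"name": "趣味", "score": 4},
               {"name": "性价比", "score": 4}],
    "touristTypeDisplay": "家庭亲子",
    "userInfo": {"userMember": "VIP"},
}
print(extract(item))
```

Rows that come back as None are simply not appended to the worksheet, mirroring the try/except in the original loop.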

Visualization Implementation (main code):

    tooltip: {
        show: true,
        formatter: function(params) {
            // '热度' means "heat" (popularity); scatter points carry [lng, lat, value]
            if (params.value.length > 1) {
                return '  ' + params.name + '   ' + params.value[2] + '热度  ';
            } else {
                return '  ' + params.name + '   ' + params.value + '热度  ';
            }
        },
    },
    geo: {
        map: 'china',
        show: true,
        roam: false,
        label: { emphasis: { show: false } },
        layoutSize: "100%",
        itemStyle: {
            normal: {
                borderColor: new echarts.graphic.LinearGradient(0, 0, 0, 1, [{
                    offset: 0,
                    color: '#00F6FF'
                }, {
                    offset: 1,
                    color: '#53D9FF'
                }], false),
                borderWidth: 3,
                shadowColor: 'rgba(10,76,139,1)',
                shadowOffsetY: 0,
                shadowBlur: 60
            }
        }
    },
    series: [{
        type: 'map',
        map: 'china',
        aspectScale: 0.75,
        label: { normal: { show: false }, emphasis: { show: false } },
        itemStyle: {
            normal: {
                areaColor: {
                    x: 0,
                    y: 0,
                    x2: 0,
                    y2: 1,
                    colorStops: [{ offset: 0, color: '#073684' }, { offset: 1, color: '#061E3D' }]
                },
                borderColor: '#215495',
                borderWidth: 1,
            },
            emphasis: { areaColor: { x: 0, y: 0, x2: 0, y2: 1, colorStops: [{ offset: 0, color: '#073684' }, { offset: 1, color: '#061E3D' }] } }
        },
        data: outdata,
    }, {
        type: 'effectScatter',
        coordinateSystem: 'geo',
        rippleEffect: { brushType: 'stroke' },
        showEffectOn: 'render',
        itemStyle: { normal: { color: { type: 'radial', x: 0.5, y: 0.5, r: 0.5, colorStops: [{ offset: 0, color: 'rgba(5,80,151,0.2)' }, { offset: 0.8, color: 'rgba(5,80,151,0.8)' }, { offset: 1, color: 'rgba(0,108,255,0.7)' }], global: false } } },
        label: { normal: { show: true, color: '#fff', fontWeight: 'bold', position: 'inside', formatter: function(para) { return '{cnNum|' + para.data.value[2] + '}'; }, rich: { cnNum: { fontSize: 13, color: '#D4EEFF' } } } },
        symbol: 'circle',
        symbolSize: function(val) {
            if (val[2] === 0) { return 0; }
            var a = (maxSize4Pin - minSize4Pin) / (max - min);
            var b = maxSize4Pin - a * max;
            return a * val[2] + b * 1.2;
        },
        data: convertData(outdata),
        zlevel: 1,
    }]
};
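The symbolSize callback above is a linear mapping from the data range [min, max] onto a pixel range [minSize4Pin, maxSize4Pin], with zero-valued points hidden and the intercept scaled by 1.2 to enlarge small points. The same mapping in Python, for clarity:

```python
def symbol_size(value, vmin, vmax, smin, smax):
    """Mirror of the ECharts symbolSize callback: map value in
    [vmin, vmax] linearly onto roughly [smin, smax] pixels."""
    if value == 0:
        return 0                      # hide zero-valued points, as in the JS
    a = (smax - smin) / (vmax - vmin) # slope of the linear map
    b = smax - a * vmax               # intercept so that vmax -> smax
    return a * value + b * 1.2        # 1.2 factor nudges all sizes upward

print(symbol_size(50, 0, 100, 10, 30))
```

With a data range of 0–100 and a size range of 10–30 px, a value of 50 lands at 22 px (the midpoint 20 px plus the 1.2 bias on the intercept).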
Tags: MySQL, Flask, data visualization, ECharts, web scraping, tourism
Written by

Python Programming Learning Circle

A global community of Chinese Python developers offering technical articles, columns, original video tutorials, and problem sets. Topics include web full‑stack development, web scraping, data analysis, natural language processing, image processing, machine learning, automated testing, DevOps automation, and big data.
