import json
import tweepy

class MyStreamListener(tweepy.StreamListener):
    def __init__(self, api=None):
        super(MyStreamListener, self).__init__()
        self.num_tweets = 0
        self.file = open("tweets.txt", "w")

    def on_status(self, status):
        # Serialize the raw tweet payload as one JSON object per line
        tweet = status._json
        self.file.write(json.dumps(tweet) + '\n')
        self.num_tweets += 1
        if self.num_tweets < 100:
            return True
        # Close the file before telling the stream to disconnect;
        # in the original, close() sat after the return and never ran.
        self.file.close()
        return False

    def on_error(self, status):
        print(status)
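The listener's core logic (write each tweet as a JSON line, stop after a fixed count) can be exercised without tweepy or network access. This is a minimal sketch under that assumption — `TweetLogger` is a hypothetical helper, not part of the gist or of tweepy:

```python
import json
import os
import tempfile

class TweetLogger:
    """Append tweet dicts to a file as JSON lines; signal stop at a limit."""

    def __init__(self, path, limit=100):
        self.num_tweets = 0
        self.limit = limit
        self.file = open(path, "w")

    def log(self, tweet):
        # Mirrors on_status: write JSON line, count, return False when done
        self.file.write(json.dumps(tweet) + "\n")
        self.num_tweets += 1
        if self.num_tweets < self.limit:
            return True
        self.file.close()  # close before asking the stream to stop
        return False

# Usage with a small limit for illustration
path = os.path.join(tempfile.mkdtemp(), "tweets.txt")
logger = TweetLogger(path, limit=2)
first = logger.log({"text": "hello"})   # True: under the limit
second = logger.log({"text": "world"})  # False: limit reached, file closed
```

Separating the file-writing logic from the listener class like this also makes the stop-and-close behaviour straightforward to unit test.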
This is a Tweepy streaming listener that collects tweets in real time and saves them to a file, which is useful for data analysis or building datasets. It targets the old Twitter v1.1 streaming API, though: `tweepy.StreamListener` was removed in Tweepy 4.x, so the script would need porting (e.g. to `tweepy.StreamingClient`) to work against the current API.
Working with older Twitter API code like this is still useful for understanding how streaming data collection and listeners work, especially for learning or maintaining legacy projects. Writing tweets to a file in real time is a simple but effective approach for data logging and analysis.
I’ve worked with similar scripts before, and handling rate limits and API errors is really important for something like this. Adding proper logging also helps a lot when debugging issues later. I used a similar setup when integrating data feeds into a website, and small tweaks made it much more stable.
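The rate-limit point above can be sketched as a small retry helper. This is an illustrative pattern, not tweepy's API: `with_backoff` is a hypothetical function you might wrap around a reconnect or request call that raises on HTTP 420/429 responses, waiting exponentially longer between attempts:

```python
import time

def with_backoff(fn, retries=3, base_delay=1.0, sleep=time.sleep):
    """Call fn(), retrying with exponential backoff on any exception.

    `sleep` is injectable so tests can record waits instead of sleeping.
    """
    delay = base_delay
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise          # out of retries: surface the error
            sleep(delay)       # wait before trying again
            delay *= 2         # double the wait each attempt
```

Combined with logging inside the `except` branch, this makes intermittent stream disconnects much easier to diagnose than a bare crash.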