blog.gnip.com
Zach Hofer-Shall, Head of Twitter Ecosystem - Gnip Blog - Social Data and Data Science Blog
https://blog.gnip.com/author/zachhofershall
Author: Zach Hofer-Shall, Head of Twitter Ecosystem. Zach Hofer-Shall is the Head of Twitter Ecosystem, where he manages Twitter's B2B data and platform partners and leads the Twitter Certified Program team. Prior to Twitter, Zach was a Senior Analyst with Forrester Research covering social technology platforms, social strategy, and social intelligence. He resides in Oakland with his wife and twin boys. Working Directly With the Twitter Data Ecosystem. April 10, 2015. If you’re one of the companies stil...
engineering.gnip.com
Engineering Blog – Big Data Engineering and Development Blog – Gnip
https://engineering.gnip.com/tag/memory-optimization
Tag Archives: memory optimization. Enriching With Redis: Part 1. May 2, 2014. We’ve talked previously about how we use Redis to Track Cluster Health for our Historical PowerTrack. Now we’ll be exploring how we enrich with Redis. For every activity that flows through Gnip, we add extra metadata that provides additional information our customers are interested in. We call this extra metadata Enrichments, and it’s what allows customers to filter on URLs or Profile Location. But for a quick overview, t...
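The enrichment lookup this snippet describes can be sketched roughly as below. The key scheme (`url:<short-url>`), the output field names, and the in-memory stand-in for a Redis client are all assumptions for illustration, not Gnip's actual implementation (the real redis-py client exposes the same `get`/`set` interface against a live server):

```python
class FakeRedis:
    """In-memory stand-in for a Redis client, so the sketch runs
    without a server. redis-py's get/set have the same shape."""
    def __init__(self):
        self._data = {}

    def set(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)


def enrich(activity, client):
    """Attach an 'unwound URL' enrichment to an activity, if the
    shortened URL it contains has already been resolved and cached."""
    enriched = dict(activity)
    unwound = client.get("url:" + activity["url"])
    if unwound is not None:
        # Hypothetical enrichment payload shape, for illustration only.
        enriched["gnip"] = {"urls": [{"expanded_url": unwound}]}
    return enriched


client = FakeRedis()
client.set("url:http://t.co/abc123", "https://example.com/article")
tweet = {"id": 1, "url": "http://t.co/abc123"}
result = enrich(tweet, client)
print(result["gnip"]["urls"][0]["expanded_url"])  # https://example.com/article
```

The point of the pattern is that the enrichment is a cheap cache hit at stream time: the expensive work (resolving the URL) happens once, out of band, and every activity that flows through afterwards only pays for a key lookup.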
gnip.com
Gnip - Historical
https://gnip.com/products/historical
Go Back In Time. Access the full archive of public Twitter data with our suite of Historical APIs. The Full Archive of Public Twitter Data. Historical PowerTrack provides filtered access to the full archive of public Twitter data, enabling you to find and analyze any public Tweet — back to the very first one in 2006. Instant Access to the Last 30 Days. The 30-Day Search API provides instant and complete access to the last 30 days of public Twitter data. Full Archive Search API.
gnip.com
Gnip - Insights
https://www.gnip.com/insights
Rich Insights From Twitter Data. Our family of Insights APIs provides deep insights into audiences and content on Twitter, unlocking new business value for brands and the companies that serve them. The Audience API delivers aggregate information about audiences you define, making it easy to derive valuable insights about these audiences on Twitter. The Engagement API provides powerful access to impression and engagement data to help you optimize the full potential of your content strategy on Twitter.
engineering.gnip.com
Engineering Blog – Big Data Engineering and Development Blog – Gnip
https://engineering.gnip.com/tag/amazon-s3
Tag Archives: Amazon S3. Exploring S3 Read Performance. November 13, 2013. Jud Valeski, Co-Founder & CTO. This blog post is a collaboration between Jud Valeski and Nick Matheson. When you use Peta as the unit to measure your S3 storage usage, you’re dealing with a lot of data. Gnip has been a tremendously heavy S3 user since S3 came online; it remains our large-scale durable storage solution. One such approach is using HTTP byte range requests into our S3 buckets in order to access just the blocks of dat...
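The byte-range approach this snippet mentions can be sketched as follows: compute the inclusive byte span of a fixed-size block, and send it as an HTTP `Range` header so S3 returns only that slice. The block size, bucket, and key below are made up for illustration; the `Range` parameter on boto3's `get_object` is the real mechanism:

```python
def block_range(block_index, block_size):
    """Return the HTTP Range header value for the Nth fixed-size block
    of an object, e.g. block 0 of 1 MiB blocks -> 'bytes=0-1048575'."""
    start = block_index * block_size
    end = start + block_size - 1  # Range end offsets are inclusive
    return "bytes=%d-%d" % (start, end)


# With boto3, the header value would be passed like this (names are
# hypothetical; not executed here):
#   s3 = boto3.client("s3")
#   resp = s3.get_object(Bucket="my-bucket", Key="my-key",
#                        Range=block_range(2, 1 << 20))
# and resp["Body"] then contains only that slice of the object.

print(block_range(0, 1 << 20))  # bytes=0-1048575
print(block_range(2, 1 << 20))  # bytes=2097152-3145727
```

At petabyte scale the win is that a reader touching a few blocks of a large object transfers kilobytes or megabytes instead of the whole object.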
engineering.gnip.com
Engineering Blog – Big Data Engineering and Development Blog – Gnip
https://engineering.gnip.com/tag/database
Enriching With Redis Part II: Schema Happens. July 15, 2014. As a follow-up to our first post about Redis at Gnip, we’ll discuss how we built on our basic client-side Redis usage. Early on, our Redis implementation was great: it was performant, and we had failover, replication, and health checks. However, a lot of our client code for Redis had become pretty low level. For instance, code to read or write an enrichment had to worry about, and implement, things like: Which key is this going to live in? GeoEn...
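The abstraction this snippet gestures at can be sketched as a small store that owns key construction, so callers read and write enrichments by type and id and never touch raw keys. The class, method names, and key scheme below are hypothetical, and a plain dict stands in for the Redis client:

```python
class EnrichmentStore:
    """Callers ask for an enrichment by type and id; the store alone
    decides which key it lives in. Names and key scheme are made up."""

    def __init__(self, client):
        # Any mapping-like client works; a real deployment would pass
        # a Redis client with equivalent get/set behavior.
        self._client = client

    def _key(self, enrichment_type, item_id):
        # Key construction lives in exactly one place, so changing the
        # schema no longer means hunting through low-level call sites.
        return "enrichment:%s:%s" % (enrichment_type, item_id)

    def put(self, enrichment_type, item_id, value):
        self._client[self._key(enrichment_type, item_id)] = value

    def get(self, enrichment_type, item_id):
        return self._client.get(self._key(enrichment_type, item_id))


store = EnrichmentStore({})
store.put("profile_location", "user42", "Boulder, CO")
print(store.get("profile_location", "user42"))  # Boulder, CO
```

Centralizing the key schema is what makes "schema happens" survivable: when the layout of keys changes, only `_key` changes.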
engineering.gnip.com
Engineering Blog – Big Data Engineering and Development Blog – Gnip
https://engineering.gnip.com/author/ericwendelin
Building a Killer (Twitter) Search UI. December 16, 2013. Eric Wendelin, Lead Software Engineer. Gnip is primarily an API company, providing its customers with API access to the social data activities they request and putting the onus of processing, analysis, and display of the data on said customer. In the case of our Search API, one request to Gnip’s Search Count API gives a nice history of frequency in the form of a line chart like this one. You can see how I configured Highcharts. And tweets support ...
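The step from count buckets to a line chart can be sketched like this. The response shape below (a list of buckets with a `timePeriod` string and a `count`) and the output format (epoch-milliseconds/value pairs, the kind of series a charting library such as Highcharts accepts) are assumptions for illustration:

```python
from datetime import datetime, timezone


def to_chart_series(buckets):
    """Convert count buckets into [timestamp_ms, count] pairs suitable
    for a time-series line chart."""
    series = []
    for bucket in buckets:
        # 'timePeriod' assumed to be a UTC timestamp like '201312160000'.
        ts = datetime.strptime(bucket["timePeriod"], "%Y%m%d%H%M")
        ts = ts.replace(tzinfo=timezone.utc)
        series.append([int(ts.timestamp() * 1000), bucket["count"]])
    return series


buckets = [
    {"timePeriod": "201312160000", "count": 1024},
    {"timePeriod": "201312170000", "count": 2048},
]
print(to_chart_series(buckets))
```

Handing the chart pre-shaped pairs keeps the UI layer dumb: the frontend only plots what the count endpoint already aggregated, rather than re-counting raw activities.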
gnip.com
Gnip - Realtime
https://gnip.com/realtime
The Data You Need, The Moment You Need It. Get the data you need when you need it — right now — with our suite of Real-time APIs. All Signal, No Noise. PowerTrack offers powerful filtering and complete coverage of the data you need, delivered in real time. Real-time Trend Detection and Discovery. The Decahose delivers a 10% sample of Tweets – in real time – to help you uncover trends that can impact your business.