Lo-Fi Python

Sep 13, 2020

Delete All Your Tweets with Tweepy and the Twitter API

You may want to download an archive of your tweets before deleting them. I did this and it took about a day to get my archive download.

How To Purge Your Tweet History with Python

  1. Per the Tweepy library documentation, install tweepy with pip. It worked fine in my Python 3.8 virtual environment.
pip install tweepy
  2. Sign up for a Twitter Developer account and create an app. I named mine "tweetcleanr".
  3. Find your app under "Projects & Apps". Edit your app's permissions to "Read + Write + Direct Messages".
  4. After you update your permissions, select the "Keys and tokens" tab. Regenerate new API keys and paste them into the script below.
Twitter Dev UX
  5. Save the script below as a python file, such as delete_tweets.py. In command prompt or terminal, run python delete_tweets.py.
  6. You'll be asked to go to a link and enter an authorization code. Then you'll see your tweets being deleted like pictured below.

delete_tweets.py

I found this GitHub Gist via Google and updated the print and input statements to Python 3. I also added the traceback module in case you need to debug it. Initially, I received an error telling me to complete step 3 above, but I didn't see the error message until I added traceback.print_exc() like you see below.

import tweepy
import traceback

"""Delete All Your Tweets - Github Gist by davej
Credit: https://gist.github.com/davej/113241
Ported to Python 3 by Lo-Fi Python: https://lofipython.com/delete-all-your-tweets-with-tweepy-and-the-twitter-api/
"""
CONSUMER_KEY = "get_from_dev_portal"
CONSUMER_SECRET = "get_from_dev_portal"


def oauth_login(consumer_key, consumer_secret):
    """Authenticate with twitter using OAuth"""

    auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
    auth_url = auth.get_authorization_url()

    verify_code = input(
        "Authenticate at %s and then enter your verification code here > " % auth_url
    )
    auth.get_access_token(verify_code)

    return tweepy.API(auth)


def batch_delete(api):
    print(
        "You are about to delete all tweets from the account @%s."
        % api.verify_credentials().screen_name
    )
    print("Does this sound ok? There is no undo! Type yes to carry out this action.")
    do_delete = input("> ")
    if do_delete.lower() == "yes":
        for status in tweepy.Cursor(api.user_timeline).items():
            try:
                api.destroy_status(status.id)
                print("Deleted:", status.id)
            except Exception:
                traceback.print_exc()
                print("Failed to delete:", status.id)


if __name__ == "__main__":
    api = oauth_login(CONSUMER_KEY, CONSUMER_SECRET)
    print("Authenticated as: %s" % api.verify_credentials().screen_name)

    batch_delete(api)
Python Script Deleting Tweets

Twitter Cleanse Complete

Twitter has a really slick developer dashboard. Its API combined with the tweepy library got the job done for me. It's great when stuff just works, and it only took about an hour to complete. Time to start a clean slate. Here's to looking forward.

Supplementary Reading

Tweepy Documentation Tutorial

Twitter's API Tutorials

Twitter Postman Tutorial

May 18, 2020

A Guide To Making HTTP Requests To APIs With JSON & Python

This contains all of my best API-related knowledge picked up since learning how to use them. All APIs have their own style, quirks and unique requirements. This post explains general terminology, tips and examples if you're looking to tackle your first API.

Here's what is covered:

  1. API & HTTP Lingo You Should Know
  2. Testing and Exporting Python Request Code from Postman (Optional)
  3. Formatting Your Request
  4. Example GET and POST Requests
  5. "Gotchyas" To Avoid
  6. Sidebar: requests.Session()
  7. Dig deeper into requests by raising your HTTPConnection.debuglevel
Terminology Clarification: I will refer to "items" or "data" throughout this post. This could be substituted for contacts or whatever data you are looking for. For example, you might be fetching a page of contacts from your CRM, fetching your tweets from Twitter's API, or querying the Google geocoding API to look up an address and return its geolocation coordinates.

API & HTTP Lingo You Should Know

Hypertext Transfer Protocol (HTTP)

Per Mozilla, "Hypertext Transfer Protocol (HTTP) is an application-layer protocol for transmitting hypermedia documents, such as HTML. It was designed for communication between web browsers and web servers, but it can also be used for other purposes. HTTP follows a classical client-server model, with a client opening a connection to make a request, then waiting until it receives a response."

HTTP: you = client. API = way to communicate with server

Application Programming Interface (API)

Per Wikipedia, the purpose of an API is to simplify "programming by abstracting the underlying implementation and only exposing objects or actions the developer needs."

Representational State Transfer (REST)

REST is an architectural style of web APIs. It is the dominant architecture that many APIs use. Simple Object Access Protocol (SOAP) is another style I've heard of, but it seems less common nowadays.

A REST API is built for interoperability and has properties like: "simplicity of a uniform interface" and "visibility of communication between components by service agents." [Wikipedia] If an API follows REST, it has many good principles baked in.

GET, POST and PATCH

These are three common types of request methods.

  • GET: Read data returned, such as all of your tweets in the Twitter API.
  • POST: Create a new item, like writing a new tweet. Can also update existing data. Tweets aren't editable though!
  • PATCH: Similar to POST, this is typically used for updating data.
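
The three methods above can be sketched without touching the network by building (but not sending) requests with the standard library's urllib.request.Request. The endpoint urls here are made up for illustration:

```python
from urllib.request import Request

# Build, but do not send, a request for each common method.
get_req = Request("https://api.example.com/tweets", method="GET")
post_req = Request(
    "https://api.example.com/tweets",
    data=b'{"text": "hello"}',  # the request body, as bytes
    method="POST",
)
patch_req = Request("https://api.example.com/tweets/1", method="PATCH")

for req in (get_req, post_req, patch_req):
    print(req.method, req.full_url)
```

With the requests library the equivalents are requests.get(), requests.post() and requests.patch().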

URL or "Endpoint"

This is the website target to send your request. Some APIs have multiple endpoints for different functionality.

URL Parameters

Values you pass to tell the API what you want. They are defined by the API specifications, which are usually well documented. In Python's requests library, they may be passed as keyword arguments. Sometimes they are passable directly within the endpoint url string.

Body or "Payload"

To make a request, you send a payload to the url. Often this is a JSON string with the API's URL parameters and values, AKA the request body. If the API is written specifically for Python, it might accept an actual Python dictionary.

Javascript Object Notation (JSON)

JSON is the data interchange standard for all languages. Usually it is the default way to pass data into and receive data from an API. If making a POST, you can check your json object is formatted correctly by using a json linter. Or try Python's json.tool! You can also pretty print your JSON or python dictionary with the pprint module. If you're using json.dumps remember it has pretty printing accessible by keyword arguments! These features are accessible in the standard library. Isn't Python great? See also: Python 101 - An Intro to Working with JSON
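
For instance, json.dumps can both encode a dict and pretty print it via the indent keyword argument. The contact data here is made up:

```python
import json

payload = {"name": "P. Sherman", "city": "Sydney", "tags": ["crm", "contact"]}

# indent and sort_keys give human-readable, stable output.
json_str = json.dumps(payload, indent=4, sort_keys=True)
print(json_str)

# Round-trip back to a Python dict with json.loads.
decoded = json.loads(json_str)
print(decoded == payload)  # True
```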

Pages

APIs commonly return data across multiple pages when there is a lot of it. Each page is fetched one request at a time. Sometimes you can specify how many items you want per page, but there is usually a maximum per-page limit, such as 100.
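
The usual pattern is to request page after page until the API returns an empty page. Sketched here with a stand-in fetch_page function in place of a live HTTP call:

```python
def fetch_page(page_number, per_page=100):
    """Stand-in for an API call like:
    requests.get(url, params={"page": page_number, "per_page": per_page})
    Pretends the API holds 250 items total."""
    total_items = 250
    start = page_number * per_page
    return list(range(start, min(start + per_page, total_items)))


all_items = []
page = 0
while True:
    items = fetch_page(page)
    if not items:  # an empty page means we've read everything
        break
    all_items.extend(items)
    page += 1

print(len(all_items))  # 250
```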

Status Code

Each request usually gives you a numeric code corresponding to what happened when the server tried to handle your request. There is also usually a message returned.
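
The standard library's http.HTTPStatus enum is a handy offline reference for what each numeric code means:

```python
from http import HTTPStatus

# Look up the phrase and description for a few common codes.
for code in (200, 201, 404, 500):
    status = HTTPStatus(code)
    print(code, status.phrase, "-", status.description)

# A common convenience check: 2xx codes indicate success.
print(200 <= HTTPStatus.OK < 300)  # True
```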

Headers

These usually contain website cookies and authorization info. They also may tell the API what kind of data you want back. JSON and XML are the two most common types of data to return. You can specify the return format in the content-type headers.

If you need to parse an XML response, check out Python's stock ElementTree API. I've only seen a few APIs using XML responses, such as the USPS Address Validation API.
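
A quick sketch of parsing an XML response string with ElementTree. The XML below is made up, loosely in the shape of an address validation reply:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML response body.
xml_response = """
<AddressValidateResponse>
    <Address>
        <City>CHICAGO</City>
        <State>IL</State>
        <Zip5>60601</Zip5>
    </Address>
</AddressValidateResponse>
"""

root = ET.fromstring(xml_response)
address = root.find("Address")
print(address.find("City").text)   # CHICAGO
print(address.find("State").text)  # IL
```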

Authorization

Authorization varies widely. This is the level of identification you need to pass to the API to make a request. Public APIs might require none. Some just need a username and password. Others use the Oauth standard, which is a system involving credentials and tokens for extra security.

Authorization Scheme Example [Mozilla]

Authorization: <auth-scheme> <authorization-parameters>

# headers python dict example
headers = {"Authorization": f"basic {token}"}

Creating the Request JSON

I recommend using Postman in most cases, depending on the complexity of the API. If the JSON syntax is straightforward, you can format your data as a python dictionary, then convert it to a JSON object with json.dumps from the standard library's json module. But JSON can be tricky sometimes. You may also need to pass a dictionary of HTTP headers.

Some APIs have "Postman Collections", a set of Python (or any language) script templates for the API. In those cases, it might make sense to use those resources.

Path One: Make HTTP request with json & requests libraries

Format Python dict with json.dumps from the standard library's json module. Infer API requirements from documentation. Use requests for HTTP.

Path Two: Make HTTP request with Postman & requests library

Use Postman to generate the JSON payload. Plug headers and payload into requests. Use requests library for HTTP.

Postman has a friendly interface for plugging in all your pieces and tinkering with your request body until it works. Make it easier on yourself and use Postman, especially if there are collections. An alternative is to troubleshoot in Python if you are confident in your grasp of the API. I use both options depending on my familiarity with the API at hand.

Formatting Your Request

  1. Once you have the request working, you may export your Postman request to almost any language. For Python, you can sometimes export to the requests, http.client or urllib libraries. Hit the "code" button in Postman and then copy your code.
  2. Paste your Postman headers, payload and url into your existing code.
  3. You may want to use a dict or string formatting to pass values to your request parameters or url.
  4. If the API uses a token or other form of authorization that needs to be refreshed intermittently, I usually have a function that returns a token. token = fetch_token() Then put the token in the headers dict. {"Authorization": f"basic {token}"} Finally pass your headers and payload to your requests.get, requests.post, or requests.request function along with the endpoint url. You're now ready to test the request.
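
The steps above can be sketched like this. fetch_token, the endpoint url, and the payload are all placeholders you would swap for your API's specifics:

```python
import json


def fetch_token():
    """Stand-in for however your API issues a fresh token."""
    return "example_token_abc123"


token = fetch_token()
headers = {
    "Authorization": f"basic {token}",
    "Content-Type": "application/json",
}
payload = json.dumps({"name": "Example Widget"})

# With requests installed, the call would look like:
# r = requests.post("https://api.example.com/widgets", data=payload, headers=headers)
print(headers["Authorization"])  # basic example_token_abc123
```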

If you choose not to use Postman, you can use the json library. See the use of json.dumps() to convert a dictionary to a JSON object in example #2 below.

Python Installation

You can install requests with pip. Alternatively, http.client is included within the Python standard library. If you want to convert HTTP response data to a dataframe or csv, install pandas.

python -m pip install requests
python -m pip install pandas

Example #1: GET the geolocation details of any public location with the Google API

This was modified from another example of Google's Geolocation API. To use this, you need to create a developer account with Google and paste your API keys below.

import requests


# Find the best double-cheeseburger + fries $7 can buy.
payload = {"key": "Add_Google_API_Key_Here", "address": "Redhot Ranch"}
url = "https://maps.googleapis.com/maps/api/geocode/json"
# Optional: set a 5 second timeout for the http request.
r = requests.get(url=url, params=payload, timeout=5)
print(r.text)
print(r.status_code)
data = r.json()

# Extract the latitude, longitude and formatted address of the first matching location.
latitude = data["results"][0]["geometry"]["location"]["lat"]
longitude = data["results"][0]["geometry"]["location"]["lng"]
formatted_address = data["results"][0]["formatted_address"]
print(longitude)
print(latitude)
print(formatted_address)

# Optional: convert response into a dataframe with pandas.
# import pandas as pd
# location_df = pd.json_normalize(data['results'])
# location_df.to_csv('Locations.csv')

Above you can see:

  • requests makes it easy to see the server's text response with response.text
  • requests also makes JSON decoding easy with response.json()
  • pd.json_normalize() is convenient for converting the response dictionary to a dataframe.

Example #2: Encode a Python dictionary to json string and POST to a hypothetical API

  1. Create a dictionary with request body data and pretty inspect it with pprint.
  2. Encode the json string with json.dumps from the standard library's json module.
  3. POST the encoded JSON to the endpoint url with requests.
import pprint
import json
import requests


def dict_to_json_data():
    """Create request body with fictional contact details."""
    payload = {
        "first_name": "P",
        "last_name": "Sherman",
        "address": "42 Wallaby Way",
        "address_2": "",
        "city": "Sydney",
        "state": "NSW",
        "country": "AU",
        "zip": "2000",
    }
    pprint.pprint(payload)
    json_str = json.dumps(payload, ensure_ascii=True)
    # Optional: encode json str to utf-8.
    return json_str.encode("utf-8")


def post_data(json_str):
    """A fictional API request that passes a json object to requests.
    It decodes the server response with response.json() and
    returns a dictionary value by indexing one of the data's keys.
    """
    token = "add_api_token_here"  # placeholder token for this fictional API
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "cache-control": "no-cache",
    }
    r = requests.request(
        method="POST",
        url="https://SomeSoftwareAPI.com/people/",
        data=json_str,
        headers=headers,
    )
    data = r.json()
    print(data.keys())
    # Call dict keys to get their values.
    contact_id = data["contact_id"]
    return contact_id


json_str = dict_to_json_data()
contact_id = post_data(json_str)

requests.request keyword argument alternatives for passing data

params – (optional) Dictionary, list of tuples or bytes to send in the query string for the Request.

data – (optional) Dictionary, list of tuples, bytes, or file-like object to send in the body of the Request

json – (optional) A JSON serializable Python object to send in the body of the Request

[requests API documentation]

"Gotchyas" To Avoid

  • Status codes are your friend. They offer a hint at why your request is not working. If you see 200 or 201, that's a good sign. They're usually helpful, but sometimes they can be misleading.
  • Ensure you are defining the correct content-type. I had an experience where Postman defined two conflicting content-type headers and it caused my request to fail. The server's error message indicated the problem was in my JSON, so it took me a while to figure out the headers were the problem.
  • Sometimes it makes a difference whether your url has http:// or https:// in it. Usually https:// is preferred.

Sidebar: requests.Session()

You might be able to improve performance by using a requests "session" object.

import requests


# A session adds a "keep-alive" header to your HTTP connection + stores cookies across requests.
s = requests.Session()
for page in range(0, 2):
    url = f"https://exampleapi.com/widgets/{str(page)}"
    r = s.get(url)
    print(r.text)

Dig deeper into requests by raising your HTTPConnection.debuglevel

HTTPResponse.debuglevel: A debugging hook. If debuglevel is greater than zero, messages will be printed to stdout as the response is read and parsed. Source: http.client Python Docs
from http.client import HTTPConnection
import requests


HTTPConnection.debuglevel = 1
payload = {"key": "Add_Google_API_Key_Here", "address": "90 Miles"}
url = "https://maps.googleapis.com/maps/api/geocode/json"
r = requests.get(url=url, params=payload, timeout=5)
print(r.text)

Web Server Gateway Interface (WSGI, pronounced "Wis-Ghee")

"As described in PEP 3333, the Python Web Server Gateway Interface (WSGI) is a way to make sure that web servers and python web applications can talk to each other." Gunicorn is one of a few Python WSGI servers. web2py is a WSGI-compatible web framework I have used.
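
A minimal WSGI application is just a callable taking environ and start_response. Calling it directly, with no server involved, shows the contract a server like Gunicorn relies on:

```python
def app(environ, start_response):
    """A minimal WSGI application."""
    body = b"Hello, WSGI!"
    status = "200 OK"
    headers = [("Content-Type", "text/plain"), ("Content-Length", str(len(body)))]
    start_response(status, headers)
    return [body]


# Exercise the app directly, the way a WSGI server would.
collected = {}

def start_response(status, headers):
    collected["status"] = status
    collected["headers"] = headers

result = app({"REQUEST_METHOD": "GET", "PATH_INFO": "/"}, start_response)
print(collected["status"], b"".join(result))
```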

Conclusion

I remember APIs seemed mysterious and daunting before I had used them. But like all things, they can be conquered with knowledge, understanding and tenacity to keep trying until you figure it out. Good luck!

Requests Documentation

requests.request() API documentation

requests.get() API documentation

requests.post() API documentation

Supplementary Reading

Google's HTTP Timing Explanation

List of Interesting "Unofficial" APIs

Proxy servers

Making 1 million requests with python-aiohttp

Nginx

Create, read, update and delete (CRUD)

Dec 21, 2019

Copying a pandas Dataframe to Google Sheets with pygsheets

Disclaimer: This endeavor was before I discovered Apps Script, which may be an alternative solution to using pygsheets or other python libraries. pygsheets is interesting, but it could be a stretch to justify using it for something that could be done with Apps Script. Both are ways to solve a problem by automating Google Sheets operations.

This was done on the Windows 7 OS. First, install libraries with pip. Enter in command prompt or terminal:

python -m pip install pandas
python -m pip install numpy
python -m pip install pygsheets

After installing necessary libraries, follow the steps documented by pygsheets:

  1. Create a Google Developer Account at console.developers.google.com
  2. Enable Sheets API to account
  3. Enable Drive API to account. Same as last step, but search for Drive.
  4. Create a Client Secret json file. Select "Credentials" tab, and "Create Credentials". Select Client Secret from options. Export from console and place in same directory as your .py file.
  5. Create a Service Account json file by selecting it instead of "Client Secret".
  6. Authorize pygsheets with your json files. See below.
  7. Copy spreadsheet to Google Sheet with pandas and pygsheets. See below.

After completing the first 5 steps, import pygsheets and authorize your account with the client secret json file:

import pygsheets
gc = pygsheets.authorize(client_secret='path/to/client_secret[...].json')

You will be prompted by the terminal to go to a hyperlink in a browser, get your authorization code, and enter that authorization code into the terminal.

Now, import both libraries and authorize with your service account json file instead. Then, load the csv to a dataframe with pandas. Finally, copy it to an existing Google Sheet with pygsheets:

import pygsheets
import pandas as pd

"""Select worksheets by id, index, or title."""
gc = pygsheets.authorize(service_file='path/to/service_account_credentials.json')
sh = gc.open('add_google_sheet_name_here')
wks = sh.worksheet_by_title('add_sheet_tab_name_here')

"""Set a pandas dataframe to google sheet, starting at 1st row, 1st column"""
df = pd.read_csv('Example_Data.csv')
wks.set_dataframe(df,(1,1))

[Example] Split and upload a sheet with 40 columns

Google Sheets limits importing to 26 columns and 1,000 rows at a time. So you'll have to load the sheets in chunks if you have more than that. This approach uses numpy's array_split:

import pygsheets
import pandas as pd
import numpy as np

gc = pygsheets.authorize(client_secret='path/to/client_secret[...].json')
sh = gc.open('add_google_sheet_name_here')
wks = sh.worksheet_by_title('add_sheet_tab_name_here')
df = pd.read_csv('Data_to_GSheets.csv')

# split columns into two dataframes with numpy and pandas
first_half_cols, second_half_cols = np.array_split(df.columns, 2)
first_half = df[first_half_cols]
second_half = df[second_half_cols]

# set both dataframes side-by-side in Google sheet
wks.set_dataframe(first_half,(1,1))
start_column = first_half.shape[1] + 1  # columns are 1-indexed; start after the first half ends
wks.set_dataframe(second_half, (1, start_column))

Conclusion

I found the terminal error messages from pygsheets to be very helpful while debugging the above. This module offers many other nifty spreadsheet operations. Solid library. You can now create and edit Google Sheets with Python.

Apps Script should probably be the default tool when working with Google Sheets because it is built in, but Python does have tools available to work with Google Sheets.

Resources

pygsheets Github

pygsheets Documentation

Google Sheets Documentation

pandas Documentation