Releases: WebexCommunity/WebexPythonSDK
Simplify ApiError Messages
`ApiError` messages are now shorter, more insightful, and easier to inspect. 🙌
We have simplified the default string representation of `ApiError` messages. The simplified messages use the `message` attribute of the returned JSON (if present) to provide more insight into why the request failed, and default to the generic error descriptions from the API docs if a `message` attribute is not available.
Example of the New Message Format:
```
ApiError: [400] Bad Request - Message destination could not be determined. Provide only one destination in the roomId, toPersonEmail, or toPersonId field
```
The `ApiError` exceptions now have several attributes exposed for easier inspection:
- `response` - The `requests.Response` object returned from the API call.
- `request` - The `requests.PreparedRequest` used to submit the API request.
- `status_code` - The HTTP status code from the API response.
- `status` - The HTTP status from the API response.
- `details` - The parsed JSON details from the API response.
- `message` - The error message from the parsed API response.
- `description` - A description of the HTTP Response Code from the API docs.
To inspect an error, simply catch it in a try block and access the above attributes on the caught error:
```python
from webexteamssdk import ApiError, WebexTeamsAPI

api = WebexTeamsAPI()

try:
    api.messages.create()
except ApiError as error:
    print(error.message)
```
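For deeper troubleshooting, here is a minimal sketch (using the same intentionally failing call) that inspects a few of the other attributes listed above:

```python
from webexteamssdk import ApiError, WebexTeamsAPI

api = WebexTeamsAPI()

try:
    api.messages.create()  # no destination provided, so the API returns an error
except ApiError as error:
    print(error.status_code)   # HTTP status code, e.g. 400
    print(error.description)   # generic description from the API docs
    print(error.details)       # parsed JSON details from the API response
```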
See `ApiError` in the API Docs for more details.
This enhancement addresses enhancement request #62 and resolves 🐛 #68.
Python2 compatibility bug - fixed
The new `WebexTeamsDateTime` functionality had introduced a minor compatibility bug with Python v2. We squished it. 🐜💀
ciscosparkapi is now webexteamssdk!
With the name change from Cisco Spark to Webex Teams, `ciscosparkapi` is now `webexteamssdk`!
Don't worry! While it has received quite a bit of enhancing, the `WebexTeamsAPI` wrapper works just like the `CiscoSparkAPI` wrapper - only better:
- The Python objects returned by the APIs are now immutable, which means that you can include them in sets and use them as keys in dictionaries.
- Date-times returned by the Webex Teams APIs are now modeled as Python `datetime` objects, making them even easier to work with (see the sketch after this list).
- The internal package structure has been overhauled to make way for adding new capabilities to the library.
- The core library code, test suite, and docs have been refactored and are now cleaner and leaner than ever.
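For instance, here is a minimal sketch of both enhancements, assuming an authenticated `WebexTeamsAPI` session and group rooms whose `lastActivity` field is populated:

```python
from webexteamssdk import WebexTeamsAPI

api = WebexTeamsAPI()

# Immutable objects are hashable, so they work in sets and as dict keys
rooms = set(api.rooms.list(type='group'))
activity_by_room = {room: room.lastActivity for room in rooms}

# lastActivity is modeled as a Python datetime, so datetime operations just work
for room, last_activity in activity_by_room.items():
    print(room.title, last_activity.isoformat())
```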
We'll have more new capabilities to work on and announce in the coming months, but this is a good start for now. 😎
Events API Support!
With this release, `ciscosparkapi` now supports the Cisco Spark Events API! See the Events API and Event data model docs for more details.
Note: Compliance Officers may retrieve events for all users within an organization. See the Compliance Guide for more information.
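As a sketch of what that looks like, assuming an account that is authorized to read events (for example, a Compliance Officer), listing recent message-created events might look like this:

```python
from ciscosparkapi import CiscoSparkAPI

api = CiscoSparkAPI()

# Filter parameters mirror the Events API query parameters
for event in api.events.list(resource='messages', type='created', max=10):
    print(event.id, event.resource, event.type, event.created)
```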
feature: #55
Updated SparkData Classes & Fixed and Improved Rate-Limit Handling
In addition to fixing a bug (#52) in the automated rate-limit handling and making several testing improvements, this release also includes some substantial enhancements to the SparkData classes and how objects are created and returned by the API methods. 😎
New Features:
- `SparkData` classes (`Room`, `Person`, `Message`, etc.) are now composed classes created by inheriting from the `SparkData` base class and type-specific mixin classes. This makes it easier to create your own composed classes (like we are doing with the enhanced data objects in the `ciscosparksdk` package).
- The `CiscoSparkAPI` class now accepts an `object_factory=` parameter that, not surprisingly, accepts an object factory function, which is responsible for creating the objects returned by the API methods. This allows you to easily create your own object classes and have them returned by the `CiscoSparkAPI` methods. #EasilyExtensible 🔌
- Automated rate-limit handling now generates a custom `SparkRateLimitWarning` ⚠️ warning when a rate-limit response is received. If you want to know if your code is being rate-limited, you can easily catch and log these warnings to see 🙈 what is going on (see the sketch after this list).
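For example, here is a minimal sketch that surfaces rate-limit warnings with the standard `warnings` module (assuming `SparkRateLimitWarning` is importable from the package top level):

```python
import warnings

from ciscosparkapi import CiscoSparkAPI, SparkRateLimitWarning

api = CiscoSparkAPI()

with warnings.catch_warnings(record=True) as caught_warnings:
    warnings.simplefilter("always", SparkRateLimitWarning)
    rooms = list(api.rooms.list())  # may trigger rate-limit responses in large orgs

for warning in caught_warnings:
    if issubclass(warning.category, SparkRateLimitWarning):
        print("Rate-limited while listing rooms:", warning.message)
```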
Updated feature docs on the new object extensibility are coming soon. I wanted to go ahead and get the rate-limit fixes, and the initial object code out so the development can move forward on the ciscosparksdk package.
Happy Coding!
GeneratorContainer Slicing & More...
This update adds:
- Refactored `GeneratorContainer`s - now with slicing support! (for the #50 guys) - Only need the first 10 rooms, messages, etc.? All of the package's `list()` methods return `GeneratorContainer`s, which can now be easily sliced to give you exactly what you need, for example:
```python
from ciscosparkapi import CiscoSparkAPI

api = CiscoSparkAPI()
rooms = api.rooms.list(type='group')

print("Here are the first ten rooms:")
for room in rooms[:10]:
    print(room)
```
- Squashed a Python2 SparkData Bug - If you have tried to initialize a `SparkData` object (like a webhook) using Python v2, it was probably raising a `TypeError` when you did. #SquishedIt 🐛 ☠️
- Other Boring Stuff:
  - Updated script metadata (copyright notices, etc.).
  - Cleaned up some tests.
  - PEP8 fixes.
  - Blah, blah, blah... 💤
Patch for Rate Limit `retry_after` Bug
Merged in pull request #49 from @dlspano with a fix to the package's rate-limit handling support, where we (me) had accidentally removed the `SparkApiError.retry_after` attribute that is critical to handling rate-limit messages. 🤦‍♂️ - Thank you for catching this, Dave!
This release also includes a few minor commits that were added to the package in support of the ciscosparksdk work that is underway.
Expose Spark Data Object's JSON Data & Formalize Package API
A couple of small feature updates in this release:
- We are now exposing the Spark data object's JSON data in three formats (#48): 💯
  - `<spark data object>.json_data` returns a copy of the object's JSON data as an `OrderedDict`.
  - `<spark data object>.to_dict()` returns a copy of the object's JSON data as a `dict` object.
  - `<spark data object>.to_json()` returns a copy of the object's JSON data as a JSON string. Note: You can pass your favorite Python JSON encoding keyword arguments to this method (like `indent=2`, etc.). See the sketch after this list.
- We have refactored the `ciscosparkapi` main package to more clearly articulate what classes and data are being exposed to you for your use. 😎
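Here is a quick sketch of the three formats, assuming an authenticated `CiscoSparkAPI` session with at least one room:

```python
from ciscosparkapi import CiscoSparkAPI

api = CiscoSparkAPI()
room = next(iter(api.rooms.list()))  # grab any room object to inspect

ordered = room.json_data           # copy of the JSON data as an OrderedDict
plain = room.to_dict()             # copy of the JSON data as a dict
pretty = room.to_json(indent=2)    # JSON string; json.dumps() keyword args accepted

print(pretty)
```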
Up-to-Date | Cleaner & Clearer Than Ever
All of the API wrappers and data models have been reviewed and updated to match the latest Cisco Spark API capabilities. -and- We have completed a significant internal restructuring to improve all of the API wrappers:
- All API methods that accept parameters and post data have been updated to consistently accept optional (`**request_parameters`) keyword arguments. So, if Cisco releases an API update tomorrow with some awesome new parameter... You can go ahead and use it. We'll update the code later so that it shows up in your IDE as soon as we can.
- New WebhookEvent - Webhook posts to your bot or automation can now be modeled via a new `WebhookEvent`. Just pass the JSON body that Spark posts to your web service to the `WebhookEvent()` initializer, and you can use native dot-syntax to access all of the attributes (see the sketch after this list).
- Exceptions - some changes to what exceptions are raised:
  - TypeErrors - If you happen to pass an incorrectly typed parameter to one of the API methods or object initializers, the package will now raise a more appropriate and informative `TypeError` rather than an `AssertionError`.
  - ValueErrors - If you pass an incorrect value... you guessed it: `ValueError`.
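Here is a minimal sketch of a webhook handler; the Flask wiring is purely illustrative, and the attribute names follow the fields of the webhook JSON payload:

```python
from flask import Flask, request
from ciscosparkapi import WebhookEvent

app = Flask(__name__)

@app.route("/spark-webhook", methods=["POST"])
def spark_webhook():
    # Model the JSON body that Spark posts to your web service
    event = WebhookEvent(request.json)
    print(event.resource, event.event, event.data.id)
    return "OK"
```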
Exception handling should be very straightforward now. The only exceptions that you should have to catch and handle at runtime are `SparkApiError`s, which are raised when Cisco Spark responds with an error code. By the way, these were recently updated to show you the full request and response body when an error occurs. Any other errors should show up and be addressed while you are writing and debugging your code.
Please open an issue if you experience any issues with the package. We have tested it extensively so hopefully you shouldn't! ...but the issue log is there just in case. 🤞 😎
-Thank You!
Enhanced SparkApiErrors with Request and Response details
Micro release with some goodness!
We corrected issue #46 where `SparkApiError`s were not printing / displaying correctly, and we enhanced them while we were in there correcting the issue. `SparkApiError`s now include the full details of the request and response that triggered / caused the error. No more having to go to your debugger to see what the offending request and response looked like. 😎
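A minimal sketch of what that looks like in practice (the intentionally bad call is just for illustration):

```python
from ciscosparkapi import CiscoSparkAPI, SparkApiError

api = CiscoSparkAPI()

try:
    api.rooms.get('not-a-valid-room-id')  # Spark responds with an error code
except SparkApiError as error:
    # The printed error now includes the full request and response details
    print(error)
```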